Top 10 List of Week 05

Jonathan Nicholas --- Jakarta

  1. OS Demand Paging (Article)
    This is a nice overview of this week’s material from javatpoint. It briefly explains concepts like Page Fault and Thrashing. A page fault occurs when an instruction needs a page that isn’t in main memory; it’s the same concept as a cache miss. As for thrashing, if you’ve read my compilation in w04, you’ll know it happens when main memory is full, so pages get swapped in and out so often that the swapping takes more time than the actual instructions. See the little sketch below for how demand paging plays out.
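
    To make it concrete, here is a minimal Python sketch of demand paging (my own illustration, not code from the article): pages are only brought into a fixed set of frames when they are first referenced, and every miss counts as a page fault.

        import collections

        def simulate_demand_paging(reference_string, num_frames):
            """Count page faults under demand paging with simple FIFO replacement."""
            frames = collections.deque()        # pages currently resident in memory
            page_faults = 0
            for page in reference_string:
                if page not in frames:          # page fault: page is not in main memory
                    page_faults += 1
                    if len(frames) == num_frames:
                        frames.popleft()        # memory is full, evict the oldest page
                    frames.append(page)         # demand-load the referenced page
            return page_faults

        # 3 frames, a classic reference string: prints 9 page faults
        print(simulate_demand_paging([7, 0, 1, 2, 0, 3, 0, 4, 2, 3], 3))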

  2. What is copy-on-write? (Article)
    Again, a Stack Overflow thread/blog, the kind where you usually get the most information for the time invested in reading. This week’s first unfamiliar term for me is copy-on-write, so I thought this would be the perfect opportunity to familiarize myself with it. Copy-on-write, or COW, is a strategy for use cases such as multiple requests for the same resource. As long as the copies are indistinguishable (perhaps checked using checksums?), all requesters can be pointed at the same underlying resource. This holds until someone performs a write: only then is the resource actually copied, so the writer gets their own modified version while the other requesters keep the previous one. The result is sharing that is efficient yet still keeps each writer’s changes private. Really interesting read, I recommend it; a tiny sketch of the idea follows.
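
    Just to illustrate the semantics (my own toy example, not code from the thread), here is a tiny copy-on-write wrapper in Python: handles share one underlying buffer, and the first write through a handle triggers a private copy.

        class CowBuffer:
            def __init__(self, data):
                self._data = data
                self._owned = False    # True once this handle has its own private copy

            def fork(self):
                # Hand out another handle to the same data; nothing is copied yet.
                self._owned = False
                return CowBuffer(self._data)

            def read(self, i):
                return self._data[i]

            def write(self, i, value):
                if not self._owned:
                    # Copy-on-write: duplicate the shared data on the first write only.
                    self._data = list(self._data)
                    self._owned = True
                self._data[i] = value

        parent = CowBuffer([1, 2, 3])
        child = parent.fork()                  # shares parent's data, no copy yet
        child.write(0, 99)                     # first write: child takes a private copy
        print(parent.read(0), child.read(0))   # -> 1 99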

  3. Pro vs. Hard Coding Interview Problem - LRU Cache (LeetCode Day 24) (Video)
    I’ve previously mentioned the LRU cache in w03, as one of the topics is about caching. This time it’s a video tutorial about the LRU cache by Errichto, one of the best in Competitive Programming. This video is a must watch as it shows how a seasoned programmer tackles an unknown problem: understanding the problem, thinking about the right data structure, struggling, and then solving it. Basically, an LRU cache is an implementation that combines a hash map / hash table with a doubly linked list that keeps the entries in recency order. The two basic operations of an LRU cache are get and put, and it is these gets and puts that determine which item is the least recently used and therefore gets erased from the cache. This is such a great interview question and I highly recommend it; a minimal sketch is included below.
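
    For reference, here is a minimal LRU cache sketch in Python using an OrderedDict, which bundles the hash map plus recency list into one standard-library type (this is my own illustration, not Errichto's solution).

        from collections import OrderedDict

        class LRUCache:
            def __init__(self, capacity):
                self.capacity = capacity
                self.items = OrderedDict()       # key -> value, kept in recency order

            def get(self, key):
                if key not in self.items:
                    return -1
                self.items.move_to_end(key)      # a get makes this key most recently used
                return self.items[key]

            def put(self, key, value):
                if key in self.items:
                    self.items.move_to_end(key)
                self.items[key] = value
                if len(self.items) > self.capacity:
                    self.items.popitem(last=False)   # evict the least recently used key

        cache = LRUCache(2)
        cache.put(1, 'a')
        cache.put(2, 'b')
        cache.get(1)         # touches key 1, so key 2 becomes least recently used
        cache.put(3, 'c')    # over capacity: evicts key 2
        print(cache.get(2))  # -> -1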

  4. When and Why to use a Least Frequently Used (LFU) cache with an implementation in Golang (Blog)
    This blog is a hidden gem by Ilija that goes deep into the Least Frequently Used, or LFU, cache. Ilija divides the article into bite-sized sections such as why use an LFU cache, the data structures used, and the structs/types needed. A good way to picture LFU is a Content Delivery Network, or CDN, which is widely used by services where users request files frequently. Memory is limited, so to keep cache misses down, a CDN can use an LFU algorithm to keep the frequently requested files and evict the less frequent ones. The data structures used are, same as for an LRU cache, hash maps, and possibly linked lists if you implement your own hash table. This is such a rich article, and the effort shows as the author includes diagrams and code snippets along with the explanation. As a bonus, you’ll pick up some Golang from this article; such a nice read. A simplified sketch of the idea is below.
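
    Ilija's implementation is in Golang; to stay consistent with the other snippets here, this is a deliberately simple Python sketch of the same idea (my own, with an O(n) eviction scan): every access bumps a frequency counter, and when the cache is full the least frequently used key gets evicted.

        class LFUCache:
            def __init__(self, capacity):
                self.capacity = capacity
                self.values = {}     # key -> value
                self.freq = {}       # key -> access count

            def get(self, key):
                if key not in self.values:
                    return None
                self.freq[key] += 1              # every access bumps the frequency
                return self.values[key]

            def put(self, key, value):
                if key not in self.values and len(self.values) >= self.capacity:
                    # Evict the least frequently used key (linear scan keeps this short).
                    victim = min(self.freq, key=self.freq.get)
                    del self.values[victim]
                    del self.freq[victim]
                self.values[key] = value
                self.freq[key] = self.freq.get(key, 0) + 1

        cdn = LFUCache(2)
        cdn.put('logo.png', b'...')
        cdn.put('video.mp4', b'...')
        cdn.get('logo.png')            # logo.png is now the more frequently used item
        cdn.put('style.css', b'...')   # cache is full: evicts video.mp4
        print(list(cdn.values))        # -> ['logo.png', 'style.css']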

  5. Operating System - Virtual Memory (Article)
    To sum up the previous materials, this article is a nice overview of topics such as Demand Paging, reference strings, and the various page replacement algorithms such as FIFO, Optimal Page, LRU, Page Buffering, LFU, and MFU. Most of these I have already explained earlier in this week’s list. One thing that I’d like to familiarize myself with is the Optimal Page algorithm, also known as OPT or MIN, which claims to have the lowest page-fault rate of all the algorithms; I might take a deeper dive into it in a later list. The basic premise of the optimal page algorithm is to replace the page that will not be used again for the longest time in the future, which is why it needs to know the reference string in advance and is mostly used as a benchmark to compare the other algorithms against. There’s a small sketch of it below.
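
    Here is my own minimal sketch of the optimal algorithm (not code from the article): on a fault with full frames, it evicts the page whose next use lies farthest in the future, which is exactly why it needs the whole reference string up front.

        def optimal_page_faults(reference_string, num_frames):
            frames = []
            faults = 0
            for i, page in enumerate(reference_string):
                if page in frames:
                    continue                      # hit, nothing to do
                faults += 1
                if len(frames) < num_frames:
                    frames.append(page)
                    continue
                # Evict the resident page whose next use is farthest away (or never).
                future = reference_string[i + 1:]
                victim = max(frames,
                             key=lambda p: future.index(p) if p in future else len(future) + 1)
                frames[frames.index(victim)] = page
            return faults

        # Same reference string as the demand-paging sketch above: 6 faults vs FIFO's 9.
        print(optimal_page_faults([7, 0, 1, 2, 0, 3, 0, 4, 2, 3], 3))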

  6. Mock System Design Interview - Build a system like TikTok (SDE 1 level) (Video)
    Now this is a fun video for you folks in search of an SDE internship. It uses lots of concepts from previous weeks, such as Distributed File Systems and the caching done by a Content Delivery Network. It is an interesting link that takes a higher-level view of operating system knowledge, whereas more often than not we’ve been down at the low level of these topics. It also touches on future topics such as concurrency with message queues. If you’re an aspiring software developer, one of the skills to have is designing a system, and this is nice practice for testing out topics such as file systems and caching in an architectural scenario.

  7. How to Monitor RAM Usage on Linux (Article)
    On a Mac or Windows machine it’s easy to check your RAM usage just by opening the respective activity monitor application, but on Linux that might not be the case. This article is jam-packed with information on commands such as free and top. The free command shows a system’s current memory utilization in a compact, human-readable form; the top command does the same thing in a more detailed manner, displaying the memory utilization of every process on the system. The article is also nice enough to explain the different flags of each command, as well as htop, a command you need to install separately but which combines the best of free and top! Definitely a must read; there’s a small experiment below.
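
    On Linux these numbers ultimately come from /proc/meminfo, so as a small experiment (my own sketch, not from the article) you can read roughly the same figures that free reports yourself:

        def meminfo_kb():
            """Parse /proc/meminfo into a dict of {field: kilobytes} (Linux only)."""
            info = {}
            with open('/proc/meminfo') as f:
                for line in f:
                    key, value = line.split(':')
                    info[key] = int(value.split()[0])   # values are reported in kB
            return info

        mem = meminfo_kb()
        total = mem['MemTotal']
        available = mem.get('MemAvailable', mem['MemFree'])
        # A rough "used" figure, similar in spirit to what free shows.
        print(f"total: {total // 1024} MiB, in use: {(total - available) // 1024} MiB")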

  8. OS Local versus Global Allocation Policies (Article)
    This article is interesting because it might not look the prettiest, but it surprisingly has rich content which also expands into other important topics such as page faults, page replacement algorithms, and memory paging. It also has example scenarios so you’ll get a deeper understanding of what’s happening in the background; it walks through what the LRU algorithm does when a process triggers a page fault. The main takeaway is that a global algorithm dynamically allocates page frames amongst the runnable processes, whereas a local algorithm allocates a fixed fraction of memory to every process. A very nice read to strengthen your OS knowledge and test your fundamentals; see the small comparison sketched below.
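
    To see the difference in numbers, here is a toy comparison I put together (not from the article): the same LRU replacement is run either on per-process fixed frame sets (local) or on one shared pool of frames (global).

        from collections import OrderedDict

        def lru_faults(requests, num_frames):
            """Count faults for a stream of (pid, page) requests over one LRU frame pool."""
            frames, faults = OrderedDict(), 0
            for item in requests:
                if item in frames:
                    frames.move_to_end(item)     # hit: refresh recency
                    continue
                faults += 1
                if len(frames) == num_frames:
                    frames.popitem(last=False)   # evict the least recently used frame
                frames[item] = True
            return faults

        # Two processes: A touches many distinct pages, B touches only a few.
        workload = [('A', p) for p in (1, 2, 3, 4, 1, 2, 5)] + [('B', p) for p in (1, 2, 1, 2)]

        # Local policy: each process is stuck with a fixed half of the 6 frames.
        local = sum(lru_faults([r for r in workload if r[0] == pid], 3) for pid in ('A', 'B'))
        # Global policy: all 6 frames are shared dynamically between both processes.
        global_ = lru_faults(workload, 6)
        print(local, global_)   # -> 9 7: the busier process A suffers under the fixed split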

  9. Will More RAM Make your PC Faster?? (2020) (Video)
    This is the mandatory Linus Tech Tips video. Surprisingly, it touches on concepts such as page files, which are discussed in this week’s topic about memory virtualization. To sum up: 4 GB is unusable by today’s standards, 8 GB is nice for day-to-day usage, and 16-32 GB suits a pretty heavy workload such as editing or compiling. Above that, you probably won’t need it, unless you’re using it for something like deep learning.

  10. top command in Linux with Examples (Article)
    So if you’ve done this week’s assignment, you should be familiar with the top command, but perhaps still confused about what it does and what its columns mean. This article nicely explains the components, the different fields, and what they do. From this article I learned that PID is the process’s unique ID, SHR is shared memory, PR is priority, VIRT is the virtual memory used, and many more!


© 2021-2021 --- Jonathan Nicholas --- File Revision: 1.0.0---01-Mar-2021.