Cache in a Computer


 The cache in a computer, or more precisely in your processor, has become a very important part of modern computing. The cache is a very high-speed, frequently accessed memory used to speed up memory retrieval. Because it is expensive, processors come with a relatively small amount of cache compared to the main system memory. Budget processors have even less cache; this is the main way that leading processor manufacturers cut the cost of their budget parts.


How does the CPU Cache work?

Without cache memory, each time the CPU requested data it would have to send a request to the main memory, which would then send the data back across the memory bus to the processor. In computing terms, this is a slow process.


The idea of the cache is that this extremely fast memory stores the data that is requested often and, if possible, the data around it. The goal is to achieve the fastest possible response time for the processor.


It is a percentage game: if a particular piece of data has been requested five times in the recent past, it is likely to be requested again, so it is kept in the cache.
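
As a rough illustration of that "keep what is used often close at hand" idea, here is a minimal software sketch of a recency-based cache. It is purely illustrative and the names are made up for the example; real CPU caches are implemented in hardware with fixed-size lines and sets, not Python dictionaries.

    from collections import OrderedDict

    class TinyCache:
        """A toy cache that evicts the least recently used entry."""
        def __init__(self, capacity):
            self.capacity = capacity
            self.data = OrderedDict()          # address -> value, oldest first

        def get(self, address, load_from_memory):
            if address in self.data:           # cache hit: fast path
                self.data.move_to_end(address) # mark as recently used
                return self.data[address]
            value = load_from_memory(address)  # cache miss: slow trip to memory
            self.data[address] = value
            if len(self.data) > self.capacity: # full: evict least recently used
                self.data.popitem(last=False)
            return value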


What is processor cache memory, in practical terms? Let's take a library as an example of how a cache works. Imagine a large library with only one librarian (the standard single-CPU setup). The first person enters the library and asks for The Lord of the Rings.


The librarian walks over to the bookshelves (the memory bus), retrieves the book and hands it to the person. When the person is finished, the book is returned. Without a cache, the book goes straight back to its shelf in the main library, so when the next person arrives and asks for The Lord of the Rings, the same process happens and takes the same amount of time.


What does a cache mean in computer terms? If this library had a cache system, the returned book would instead be placed on a small shelf at the librarian's desk.


This way, when the next person comes in and asks for The Lord of the Rings, the librarian only has to reach over to the shelf to retrieve it, which significantly reduces the time required. Back in computing terms, it is the same idea: cached data is retrieved much faster. The computer uses its own logic to determine which data is requested most often and keeps those "books" on the shelf, so to speak.


That is a single-level caching system, and it is what most hard drives and other components use. Processors, however, use a two-tier cache system.


The principles are the same. The level 1 (L1) cache is the smallest and fastest memory; the level 2 (L2) cache is larger and slightly slower, but still much smaller and faster than main memory. Going back to the library: when The Lord of the Rings is returned, it is again placed on the shelf, but this time the library is busy, lots of other books are being returned, and the shelf fills up quickly.


The Lord of the Rings has not been taken out for a while, so the librarian takes it off the shelf and puts it in a bookcase behind the desk. The bookcase is still closer than the rest of the library and still quick to reach. Now when the next person asks for The Lord of the Rings, the librarian first looks at the shelf and sees that the book is not there, and then checks the bookcase.


It's the same for processors. They first check the L1 cache and then check the L2 cache for the data they need.
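
To make the two-tier idea concrete, here is a small sketch of that lookup order in software. Everything here (the capacities, the dictionaries, the function name) is invented for illustration; it only mirrors the shelf/bookcase/library logic described above.

    from collections import OrderedDict

    L1_CAPACITY = 4          # tiny, made-up sizes for illustration
    L2_CAPACITY = 16

    l1 = OrderedDict()       # the shelf: smallest and fastest
    l2 = OrderedDict()       # the bookcase: larger, slightly slower

    def read(address, main_memory):
        if address in l1:                       # check the shelf (L1) first
            l1.move_to_end(address)
            return l1[address]
        if address in l2:                       # then the bookcase (L2)
            value = l2.pop(address)
        else:                                   # finally the main library (RAM)
            value = main_memory[address]
        l1[address] = value                     # keep it close for next time
        if len(l1) > L1_CAPACITY:               # shelf full: move the oldest book
            old_addr, old_val = l1.popitem(last=False)
            l2[old_addr] = old_val              # ...down to the bookcase
            if len(l2) > L2_CAPACITY:           # bookcase full: drop the oldest
                l2.popitem(last=False)
        return value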


Is more Cache always better?

The answer is mostly yes, but definitely not always. The main problem with having a cache at all is that the processor always checks the cache before going to the main system memory. Consider the library again.


If 20 different people enter the library asking for various books that have not been taken out for some time, while the library has been busy so the shelf and the bookcase are already full, we have a problem. Each time a person requests a book, the librarian checks the shelf, then checks the bookcase, before realizing that the book must be in the main library.


Every time, the librarian then has to go and fetch the book from the main library. In this case a library with no cache system would actually have been faster, because the librarian would have gone straight to the main library instead of checking the shelf and the bookcase first.


In reality, a no-cache system only wins under circumstances like these, so for most applications a processor is definitely better off with a decent amount of cache memory. Applications such as MPEG encoders, however, are not good cache users because they work on a constant stream of completely different data.
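
A back-of-the-envelope calculation shows why. The cycle counts below are invented purely for illustration, but the shape of the result holds: when almost nothing is found in the cache, the lookups are pure overhead.

    # Hypothetical costs, in cycles, for this illustration only
    L1_CHECK, L2_CHECK, RAM_FETCH = 4, 12, 200

    def cost_with_cache(hit_rate):
        # hits pay only the L1 check; misses pay both checks plus the RAM fetch
        return hit_rate * L1_CHECK + (1 - hit_rate) * (L1_CHECK + L2_CHECK + RAM_FETCH)

    print(cost_with_cache(0.95))   # typical workload: about 15 cycles per access
    print(cost_with_cache(0.0))    # streaming workload: 216 cycles, worse than a plain 200-cycle fetch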


Does cache only store frequently accessed data?

Not only. If there is space in the cache, it will also store data that sits close to the commonly accessed data.


Looking back at our library: if the first person of the day enters and takes out The Lord of the Rings, an intelligent librarian might also place The Lord of the Rings Part II on the shelf.


In this case, when the person returns the book, there is a good chance they will ask for The Lord of the Rings Part II. Since this happens more often than not, it was well worth the librarian's time to fetch the second part in advance in case it was required.
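
This is the idea behind fetching neighbouring data (what hardware does by loading whole cache lines and prefetching). Here is a small sketch of it; the function name, the line size and the dictionary-based memory are all assumptions made for the example.

    def read_with_prefetch(address, cache, main_memory, line_size=4):
        if address in cache:
            return cache[address]              # hit: the neighbour was fetched earlier
        # miss: bring in the whole "line" -- the requested word and its neighbours
        start = (address // line_size) * line_size
        for addr in range(start, start + line_size):
            if addr in main_memory:
                cache[addr] = main_memory[addr]
        return cache[address]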


Cache Hit and Cache Miss

A cache hit and a cache miss are simply terms for what happens when the CPU looks in its cache. When the processor searches the cache for a piece of data, it either finds it or it does not. If the CPU finds what it is looking for, that is called a cache hit.


If it has to go to the main memory to find the data, that is called a cache miss. The percentage of hits out of all cache requests is called the hit rate, and you want it as high as possible for the best performance.
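
As a final sketch, here is how a hit rate could be tallied in a simple cache simulator. The counters and function names are made up for this example, not taken from any real tool.

    hits, misses = 0, 0

    def access(address, cache, main_memory):
        global hits, misses
        if address in cache:
            hits += 1                      # cache hit: served from the cache
            return cache[address]
        misses += 1                        # cache miss: go out to main memory
        cache[address] = main_memory[address]
        return cache[address]

    def hit_rate():
        total = hits + misses
        return hits / total if total else 0.0   # e.g. 95 hits out of 100 accesses -> 0.95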
