Dynamic Cache for High-Performance Computing

Introduction

This dynamic, reconfigurable multi-level cache technology is designed to enhance the performance of multi-purpose and heterogeneous computing architectures. By dynamically adapting to the system’s current workload, this cache system ensures optimized data retrieval, reduces latency, and increases overall computational efficiency. This technology is particularly valuable for companies involved in computing, AI, semiconductor design, and cloud services, offering a solution that meets the increasing demand for faster, more efficient computing systems.

The Challenge: Handling Diverse Computing Workloads

In today’s computing landscape, systems must handle a variety of workloads, from AI computations to data-heavy tasks, while maintaining high performance and energy efficiency. Traditional static cache designs are often insufficient to manage the demands of heterogeneous computing systems, leading to bottlenecks, increased power consumption, and reduced performance. As the complexity of data processing grows, the need for adaptable cache systems that can dynamically adjust to different workloads becomes critical.

Dynamic Cache for Efficient Processing

This dynamic, reconfigurable cache system solves these issues by automatically adjusting the cache structure and size based on real-time workload demands. This flexibility improves data access speed, minimizes latency, and optimizes power usage, resulting in a more efficient computing architecture. The technology is ideal for applications ranging from high-performance computing to AI and cloud-based services, where varying workloads require quick adaptability without sacrificing performance.

Key Benefits for Computing and AI Industries

For semiconductor companies, this technology provides a way to improve processor designs by integrating dynamic caching capabilities that enhance multi-core processing performance. AI and machine learning companies can leverage this system to accelerate data processing and improve algorithm efficiency, leading to faster results and better system optimization. Cloud service providers will find value in the cache’s ability to optimize performance across diverse virtualized environments, ensuring consistent and reliable service for users.

Investing in Future-Ready Computing

Licensing this dynamic cache for computing technology positions your company at the forefront of innovation in high-performance computing and data processing. By offering a flexible, reconfigurable cache system, your business can deliver next-generation processing solutions that meet the demands of modern computing architectures. This technology ensures faster, more efficient computing performance, reducing latency and improving energy efficiency across a range of applications.

Embodiments of a system for dynamic reconfiguration of cache are disclosed. The system includes a plurality of processors and a plurality of memory modules executed by the plurality of processors. The system also includes a dynamic reconfigurable cache comprising a multi-level cache implementing a combination of an L1 cache, an L2 cache, and an L3 cache. One or more of the L1 cache, the L2 cache, and the L3 cache are dynamically reconfigurable to one or more sizes based at least in part on an application data size associated with an application being executed by the plurality of processors. In an embodiment, the system includes a reconfiguration control and distribution module configured to perform dynamic reconfiguration of the dynamic reconfigurable cache based on the application data size.
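
As a rough illustration of the system described above, the sketch below models a reconfiguration control and distribution module that sizes each cache level from the current application data size. The class names and the fill-the-fastest-level-first heuristic are assumptions chosen for illustration, not the patented sizing logic.

```python
# Minimal sketch (not the patented implementation) of a reconfiguration
# control module that sizes each cache level from an application data size.
from dataclasses import dataclass

@dataclass
class CacheLevel:
    name: str
    max_loadable: int   # maximum loadable capacity in bytes (N1, N2, N3)
    loaded: int = 0     # currently loaded (active) capacity in bytes

class ReconfigurationController:
    def __init__(self, levels):
        # Levels ordered fastest/smallest (L1) to slowest/largest (L3).
        self.levels = levels

    def reconfigure(self, app_data_size: int):
        """Load each level with the smaller of its maximum loadable
        capacity and the remaining application data footprint."""
        remaining = app_data_size
        for level in self.levels:
            level.loaded = min(level.max_loadable, remaining)
            remaining = max(0, remaining - level.loaded)
        return {level.name: level.loaded for level in self.levels}

# Example: a 192 KiB working set fills L1 and spills partly into L2.
ctrl = ReconfigurationController([
    CacheLevel("L1", 128 * 1024),
    CacheLevel("L2", 1024 * 1024),
    CacheLevel("L3", 8 * 1024 * 1024),
])
print(ctrl.reconfigure(192 * 1024))  # {'L1': 131072, 'L2': 65536, 'L3': 0}
```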

We claim:

1. A system for dynamic reconfiguration of cache, the system comprising:

a plurality of processors;
a plurality of memory modules executed by the plurality of processors;
a dynamic reconfigurable cache comprising a multi-level cache implementing a combination of an L1 cache, an L2 cache, and an L3 cache; and
a reconfiguration control and distribution module configured to receive input including an application data size, a cache sizing type, and at least one distribution factor,
wherein one or more of the L1 cache, the L2 cache, and the L3 cache are dynamically reconfigurable to one or more sizes based at least in part on the application data size associated with an application being executed by the plurality of processors, and
wherein the reconfiguration control and distribution module is further configured to perform dynamic reconfiguration of the dynamic reconfigurable cache based on the input.
2. The system as claimed in claim 1, wherein relative maximum loadable sizes (N1, N2, N3) of the L1 cache, the L2 cache, and the L3 cache respectively satisfy N3>N2>N1.
3. The system as claimed in claim 1, wherein the dynamic reconfigurable cache is configured to track the application data size to dynamically reconfigure an association and a replacement policy for the dynamic reconfigurable cache.
4. The system as claimed in claim 1, wherein the dynamic reconfigurable cache is configured to provide an adaptable cache association to cache sizing for L1, L2 and L3 caches respectively.
5. The system as claimed in claim 1, wherein the dynamic reconfigurable cache is configured to provide an adaptable cache replacement policy for L1, L2 and L3 caches.
6. The system as claimed in claim 1, wherein one or more cache memory cells in the dynamic reconfigurable cache are distributed from a higher cache level to a lower cache level by connecting one or more levels of the multi-level cache.
7. The system as claimed in claim 1 further comprises a reconfigurable interconnection configured to connect one or more levels of multi-level cache with one or more other levels of multi-level cache to distribute one or more cache memory cells from a higher cache level to a lower cache level in the dynamic reconfigurable cache.
8. The system as claimed in claim 6, wherein L2 loaded cache capacity is distributed to expand L1 loaded cache by a first distribution factor 1/k1, and L3 loaded cache capacity is distributed to expand L2 loaded cache by a second distribution factor 1/k2.
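
For claim 8, the following is a hedged numeric sketch of how loaded capacity might be redistributed between adjacent levels using distribution factors 1/k1 and 1/k2. The factor values, the capacities, and the distribute helper are illustrative assumptions, not values prescribed by the patent.

```python
# Illustrative redistribution of loaded capacity between adjacent cache
# levels using distribution factors 1/k1 and 1/k2 (all values assumed).
def distribute(l1, l2, l3, k1=4, k2=2):
    """Move 1/k1 of L2's loaded capacity into L1 and 1/k2 of L3's
    loaded capacity into L2, returning the expanded loaded sizes."""
    l1_gain = l2 // k1                    # capacity L2 gives up to expand L1
    l2_gain = l3 // k2                    # capacity L3 gives up to expand L2
    return (l1 + l1_gain,                 # expanded L1
            l2 - l1_gain + l2_gain,       # L2 after giving to L1, taking from L3
            l3 - l2_gain)                 # L3 after giving to L2

# Example with loaded capacities in KiB: L1=128, L2=1024, L3=8192.
print(distribute(128, 1024, 8192))  # (384, 4864, 4096)
```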

9. A computer-implemented method for reconfiguration of a multi-level cache memory, the method comprising:

in a system comprising one or more processors coupled to the multi-level cache memory,
determining, in run-time, a current application data size associated with an application being executed by the one or more processors;
deriving dynamically a cache sizing for one or more levels of multi-level cache memory based on a maximum loadable capacity of each cache level of the multi-level cache memory and the determined current application data size;
loading cache memory cells of the multi-level cache memory based on the derived cache sizing to obtain reconfigured cache sizing;
performing cache association for the one or more levels of multi-level cache memory based on the reconfigured cache sizing;
applying one or more cache replacement policies for the one or more levels of multi-level cache memory based on the reconfigured cache sizing; and
generating a cache association output vector that comprises a loaded capacity, a block size, an association type and a cell size per loaded size range for one or more cache levels in the multi-level cache memory.
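
A minimal sketch of the run-time flow in claim 9 follows, assuming a simple sizing rule and made-up association and replacement-policy choices; the helper name and output-vector field names are illustrative, not the claimed implementation.

```python
# Sketch of the claim 9 flow: derive sizing, adapt association and
# replacement policy, and emit a cache association output vector per level.
def reconfigure_multilevel_cache(app_data_size, max_loadable, block_size=64):
    """max_loadable: e.g. {'L1': N1, 'L2': N2, 'L3': N3} with N3 > N2 > N1."""
    output_vectors, policies = {}, {}
    remaining = app_data_size
    for level, capacity in max_loadable.items():
        loaded = min(capacity, remaining)               # derive cache sizing
        remaining = max(0, remaining - loaded)
        ways = 2 if loaded <= capacity // 2 else 4      # association per sizing
        policies[level] = "LRU" if ways <= 2 else "pseudo-LRU"  # replacement policy
        output_vectors[level] = {                       # cache association output vector
            "loaded_capacity": loaded,
            "block_size": block_size,
            "association_type": f"{ways}-way",
            "cell_size": block_size,  # assumed: one cache memory cell per block
        }
    return output_vectors, policies

vectors, policies = reconfigure_multilevel_cache(
    192 * 1024,
    {"L1": 128 * 1024, "L2": 1024 * 1024, "L3": 8 * 1024 * 1024})
print(vectors["L1"]["association_type"], policies["L2"])  # 4-way LRU
```
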
10. The method as claimed in claim 9 further comprises extracting the current application data size from an application profile.
11. The method as claimed in claim 9 further comprises tracking dynamic application performance to enable or disable a continuous tracking of the current application data size.
12. The method as claimed in claim 9 further comprises distributing the cache sizing from a higher level cache to the next lower level cache in the multi-level cache memory.
13. The method as claimed in claim 12 further comprises obtaining a plurality of distribution factors, wherein the distributing of the cache sizing is based at least in part on the obtained plurality of distribution factors.
14. The method as claimed in claim 12, wherein the higher level cache has a faster memory technology as compared to the next lower level cache of the multi-level cache memory.
15. The method as claimed in claim 9, wherein performing cache association comprises providing a set allocation connectivity and a block allocation connectivity for cache memory cells in the multi-level cache memory to achieve a desired cache association.

16. The method as claimed in claim 9 further comprising:

generating a set connectivity routing table and a block connectivity routing table for each level of multi-level cache memory; and
driving, for each level of the multi-level cache memory, cache memory cells set allocation and block allocation connections to achieve a desired cache association for each level of the multi-level cache memory.

17. The method as claimed in claim 16 further comprises:

obtaining a set allocation connectivity vector and a block allocation connectivity vector for each level of the multi-level cache memory; and
generating a connectivity table for each level of the multi-level cache memory for set and block allocation of cache memory cells for achieving desired cache association corresponding to each level of the multi-level cache.
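
As a hedged sketch of claims 16 and 17, the code below shows one way set-allocation and block-allocation connectivity tables could be generated from a reconfigured sizing; the table layout and the connectivity_tables helper are assumptions for illustration only.

```python
# Sketch of building per-level set/block connectivity routing tables from a
# reconfigured sizing (field layout assumed, not the patent's actual format).
def connectivity_tables(loaded_bytes: int, block_size: int, ways: int):
    """Map each cache memory cell (one block) to a (set, way) position."""
    num_blocks = loaded_bytes // block_size
    num_sets = num_blocks // ways
    set_table = {}    # cell index -> set index (set allocation connectivity)
    block_table = {}  # cell index -> way index (block allocation connectivity)
    for cell in range(num_blocks):
        set_table[cell] = cell % num_sets
        block_table[cell] = cell // num_sets
    return set_table, block_table

# Example: 4 KiB loaded as 64-byte blocks, 2-way associative
# -> 64 cells mapped across 32 sets x 2 ways.
sets, blocks = connectivity_tables(4 * 1024, 64, 2)
print(sets[0], blocks[0], sets[33], blocks[33])  # 0 0 1 1
```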

18. A system for dynamic reconfiguration of cache, the system comprising:

a dynamic reconfigurable multi-level cache; and
a reconfiguration control and distribution module configured to receive input including an application data size, a cache sizing type, and at least one distribution factor, wherein the reconfiguration control and distribution module is further configured to dynamically reconfigure sizing of the dynamic reconfigurable multi-level cache based at least in part on the application data size associated with an application being executed.
19. The system as claimed in claim 18, further comprising a reconfigurable interconnection configured to create a distributed multi-level cache by organizing a plurality of cache memory cells from a higher level cache into a lower level cache for expansion of the lower level cache in the dynamic reconfigurable multi-level cache.

20. The system as claimed in claim 18, further comprising:

a cache association output vector driven by the reconfiguration control and distribution module and comprising a loaded capacity, a block size, an association type and a cell size per loaded size range for one or more cache levels in the multi-level cache memory.

Title

Dynamic reconfigurable multi-level cache for multi-purpose and heterogeneous computing architectures

Inventor(s)

Khalid ABED, Tirumale RAMESH

Assignee(s)

Jackson State University

Patent #

11372758

Patent Date

June 28, 2022
