Bridging this technology gap has been a long road for Intel. Almost three years after the reveal of 3D XPoint memory, the company has finally launched Optane DC, which it describes as an “entirely new class” of memory and storage technology.
Optane DC works differently from traditional DRAM but plugs into standard DIMM slots. It bridges the gap between DRAM and secondary storage by raising the amount of memory available per CPU to as much as 3TB; the module simply fits into the motherboard like a regular memory stick. The key selling point, according to the company, is persistence: data is retained even when the system loses power.
The modules will be available in three capacities: 128GB, 256GB, and 512GB. They are supported only by Intel's latest Xeon processors. According to Intel, what makes this memory special is that it is cost-effective while still performing comparatively well, which is why the company bills it as the “best database solution”.
Intel promises up to a 9.4x boost in operations per second, achieved largely by reducing latency: keeping data closer to the processor cuts restart times from literally minutes to seconds. Data stored on the modules is encrypted in hardware. Beyond that, the memory enables faster data analysis and supports more server instances per machine.
“Unlike traditional DRAM, Intel Optane DC persistent memory will offer the unprecedented combination of high-capacity, affordability and persistence. By expanding affordable system memory capacities (greater than 3 terabytes per CPU socket), end customers can use systems enabled with this new class of memory to better optimize their workloads by moving and maintaining larger amounts of data closer to the processor and minimizing the higher latency of fetching data from system storage,” writes Lisa Spelman, Vice President and General Manager of Intel Xeon Products and Data Center Marketing at Intel.
For now, Intel is prioritizing availability to app developers over general availability, so that software can learn to exploit large chunks of the newly introduced memory efficiently before it reaches the masses.
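In practice, applications typically reach persistent memory through memory-mapped files (for example via Intel's PMDK libraries). As a rough sketch of that programming model, a plain memory-mapped file can stand in for a persistent-memory region: writes go through a memory mapping and survive a process restart. The file name and sizes below are illustrative assumptions, not Intel APIs.

```python
import mmap
import os

REGION_FILE = "pmem_region.bin"   # stand-in for a pmem-backed file (assumption)
REGION_SIZE = 4096                # tiny demo region, not a real 128GB module

def open_region(path, size):
    """Create or open a file-backed region and map it into the address space."""
    fd = os.open(path, os.O_RDWR | os.O_CREAT, 0o600)
    os.ftruncate(fd, size)
    buf = mmap.mmap(fd, size)
    return fd, buf

# First "run": write data through the mapping, then flush so it persists.
fd, buf = open_region(REGION_FILE, REGION_SIZE)
buf[0:5] = b"hello"
buf.flush()                       # analogous to a pmem flush/drain step
buf.close()
os.close(fd)

# Second "run" (simulating a restart): the data is still there.
fd, buf = open_region(REGION_FILE, REGION_SIZE)
assert buf[0:5] == b"hello"
buf.close()
os.close(fd)
os.remove(REGION_FILE)
```

With real Optane DC modules the mapping bypasses the page cache entirely (DAX), so loads and stores hit the media directly at memory speeds; the file-backed version above only mimics the persistence semantics.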