The end of the year is a good time both to be thankful and to remind ourselves that we face a future of immensely complex challenges. The challenges in our global future are many: population growth, ending hunger, universal education, sustainable energy, clean water, global security, safe megacities, gainful employment and improving health, to name a few.
I am not a pessimist about these global challenges. Rather, I am reporting that this is our shared destiny: finding global solutions to meet the challenges of our future is what our civilization has done for millennia. We adapt, innovate and invent to survive. This will require that we invent jobs, solutions and innovations faster. We need to collapse time. To do this, we need the next generation of computers to deliver the "what's next" much faster than ever before.
As a global futurist, I care about shaping better futures. All of my clients do as well. Many innovations in technology have proven to be quite useful in creating better futures. But frankly, we need faster solutions to keep pace with the growing challenges. And many of the ways to meet these challenges must deploy a new generation of thinking machines, predictive analytics, big data, synthetic biology, 3D makers and smarter robotics. We need to do better.
According to a recent forecast by Gartner, the number of connected devices will reach 20.8 billion by 2020. The amount of data this represents in voice, email, GPS and DNA is beyond our capacity to capture or store today. We cannot move from big data to big insight without new tools to process that data.
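To get a feel for the scale, here is a hypothetical back-of-envelope calculation. The 20.8 billion device count is Gartner's forecast cited above; the per-device daily data rate is purely an illustrative assumption, not a reported figure.

```python
# Back-of-envelope: daily data volume from connected devices.
# DEVICES comes from the Gartner forecast quoted in the text;
# MB_PER_DEVICE_PER_DAY is an assumed average for illustration only.
DEVICES = 20.8e9            # Gartner forecast for 2020
MB_PER_DEVICE_PER_DAY = 1   # assumption: 1 MB/device/day

# 1 petabyte = 1e9 megabytes
daily_petabytes = DEVICES * MB_PER_DEVICE_PER_DAY / 1e9
print(f"~{daily_petabytes:.1f} PB of new data per day")
```

Even at a conservative one megabyte per device per day, that is roughly 20 petabytes of new data daily, which illustrates why existing storage and processing tools fall short.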
We need a technology that outpaces Moore's Law to keep up. Existing computing platforms do not have the bandwidth and capabilities to process, store, manage, transmit and secure the big data we need to solve these problems. And the gargantuan data tsunami coming, driven by the Internet of Things, mobile devices, enterprise applications and the digital transformation of the planet, cannot be managed without new computing tools. You see the dilemma.
We need to accelerate invention--Innovation Velocity. We need not just more powerful or smarter computers, we need an entirely new computing architecture that can address big complex global challenges. That is the breakthrough that we need to shape the future. Enter Memory-Driven Computing.
I am excited by Hewlett Packard Enterprise's Memory-Driven Computing because, based on a proof of concept just revealed, it appears to be a breakthrough: a new paradigm in computing built around memory. Memory-Driven Computing is an entirely new architecture that puts memory, not the processor, at the core of the computing platform. It also includes a photonics fabric, a new way to accelerate and enable high-performance communications, which has been shown to run much faster than anything we have in computing today. By this new metric, Memory-Driven Computing can process 64 terabytes of data in 500 nanoseconds.
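The architectural shift described above can be illustrated with a toy sketch: in a conventional processor-centric design, each compute worker copies the data into its own memory before working on it, while in a memory-centric design all workers address one shared pool directly. This is a conceptual illustration in ordinary Python, not HPE's actual fabric or API.

```python
# Toy contrast between processor-centric and memory-centric designs.
# Names and structure are illustrative assumptions, not HPE's implementation.

data = list(range(1_000_000))  # stand-in for a large shared dataset

def processor_centric(workers: int) -> int:
    """Each worker duplicates the dataset before computing (copy cost)."""
    total = 0
    for _ in range(workers):
        local_copy = list(data)   # every worker pays for its own copy
        total += sum(local_copy)
    return total

def memory_centric(workers: int) -> int:
    """All workers read the same pool directly; no duplication."""
    total = 0
    for _ in range(workers):
        total += sum(data)        # shared access, zero copies
    return total

# Both produce the same answer; only the data movement differs.
assert processor_centric(4) == memory_centric(4)
```

The point of the sketch is that the result is identical either way; what Memory-Driven Computing changes is how much time and energy is spent moving data to processors rather than computing on it.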
What are the advantages? We could be looking at creating a new energy source, ending diseases, increasing agricultural production, boosting enterprise productivity, forecasting climate change, improving financial modeling, strengthening cybersecurity or meeting other global challenges. We could be inventing better futures that also solve the challenges of space travel, understanding black holes or building the next fusion reactor for clean energy production.
When it comes to the enterprise and entrepreneurs, Memory-Driven Computing could create a new, dynamic global marketplace of services, inventions and applications. This could enable innovation exponentially faster than anything we have seen. The future needs new ideas and a new generation of entrepreneurs and companies ready to leverage Innovation Velocity.
If you want to know where the next jobs are, what careers are coming and how to beat out that robot for your job, check out Memory-Driven Computing to see where innovation is headed this coming year and beyond.