Will we ever see a world where there are no longer gasoline-fueled cars on the road? Of course! Why spend money on gas and pollute the planet when you can save Mother Earth and spend less? Ironically, gasoline as energy will become extinct, just like the dinosaurs that produced it.

Will we ever see a world where users no longer care which database system their data came from, but instead query it as quickly as running a browser search over the internet? Of course! Why spend time and money moving data to a single platform for a single version of the truth when the truth can reside across many data sources? Ironically, users with no technical background can now join data across platforms thousands of times faster than today’s best data scientists.

Virtualization is the ability to join data no matter where it resides, without any technical expertise, and it is as inevitable for data as electric power is for cars!

Tom Coffing, the world’s leading expert on Virtualization, is hosting a webinar this Thursday, January 21st, at 1:00 EST to demonstrate Virtualization live.

Just as only a few have ventured to build electric cars, even fewer have taken on the seemingly impossible task of Virtualization. When asked why this holy grail of data is the road never traveled, Tom Coffing says, “Database vendors each speak a proprietary language to make it difficult for customers to churn. Providing an environment where data flows freely among systems has taken close to 20 years of overcoming near-impossible obstacles.”

The first obstacle is creating a user-friendly query tool that can simultaneously query every database platform on the market. It is a great challenge to build a query tool for a single database, let alone one that works across all platforms. The one positive is that every database platform can use SQL as its query language.
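
To give a feel for what that common SQL foundation makes possible, here is a tiny illustrative sketch (my illustration, not Nexus code) of one tool sending the same SQL text to several databases through their standard Python drivers. The example uses two in-memory SQLite databases as stand-ins for two vendor platforms; in practice each platform would have its own driver and connection details.

```python
# Minimal sketch: one SQL statement, many platforms.
# SQLite stands in for any vendor driver that follows Python DB-API 2.0.
import sqlite3

def run_everywhere(connections, sql):
    """Run one SQL statement against every open DB-API connection."""
    results = {}
    for name, conn in connections.items():
        cur = conn.cursor()
        cur.execute(sql)
        results[name] = cur.fetchall()
    return results

# Two in-memory databases standing in for two different platforms.
conns = {"platform_a": sqlite3.connect(":memory:"),
         "platform_b": sqlite3.connect(":memory:")}
for c in conns.values():
    c.execute("CREATE TABLE t (id INTEGER)")
    c.execute("INSERT INTO t VALUES (1)")

print(run_everywhere(conns, "SELECT id FROM t"))
```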

The next obstacle is converting table structures and data types between databases. The possibilities are endless, and even the best data scientists have spent 30 days converting 30 tables between systems. No matter which two databases a user selects, those 30 tables need to convert in 30 seconds.
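
The heart of that conversion is a mapping from each source type to the closest target type. Here is a minimal sketch of the idea (the mappings shown are illustrative examples, not Nexus’s conversion engine):

```python
# Minimal sketch: translate column data types from one platform's DDL to another's.
# The mapping entries are illustrative; a real engine covers every type of every vendor.
TYPE_MAP = {
    ("teradata", "postgres"): {"BYTEINT": "SMALLINT",
                               "DECIMAL": "NUMERIC",
                               "VARCHAR": "VARCHAR"},
}

def convert_ddl(table, columns, source, target):
    """Build a CREATE TABLE statement for the target using a type-mapping table."""
    mapping = TYPE_MAP[(source, target)]
    cols = ", ".join(f"{name} {mapping.get(dtype, dtype)}" for name, dtype in columns)
    return f"CREATE TABLE {table} ({cols})"

print(convert_ddl("orders",
                  [("order_id", "DECIMAL"), ("flag", "BYTEINT")],
                  "teradata", "postgres"))
# CREATE TABLE orders (order_id NUMERIC, flag SMALLINT)
```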

One of the most challenging obstacles is moving data across networks. It requires mastering each vendor’s load utility language and automating over 100 different ways to transfer data, the equivalent of an individual learning 100 languages and speaking them all fluently.
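
Conceptually, the automation boils down to picking the right transfer strategy for each source/target pair instead of hand-writing a load script every time. The sketch below is purely illustrative (the strategy names and pairings are hypothetical, not Nexus’s movement engine):

```python
# Minimal sketch: choose a transfer path per (source, target) pair.
from typing import Callable, Dict, Tuple

def stream_via_csv(source, target, table):
    return f"export {table} from {source} to CSV, then bulk-load it into {target}"

def native_bulk_copy(source, target, table):
    return f"use {target}'s native bulk loader to pull {table} straight from {source}"

# Hypothetical registry of best-known paths; a real tool covers hundreds of pairs.
STRATEGIES: Dict[Tuple[str, str], Callable] = {
    ("oracle", "yellowbrick"): native_bulk_copy,
    ("sqlserver", "postgres"): stream_via_csv,
}

def move_table(source, target, table):
    plan = STRATEGIES.get((source, target), stream_via_csv)  # safe default path
    return plan(source, target, table)

print(move_table("oracle", "yellowbrick", "sales"))
```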

If, by chance, you have the wherewithal, the tenacity, and the funding to overcome these challenges, you are only halfway there. You now need to provide a graphical user interface that shows tables visually. When the user checkmarks the columns for their result set, they want the SQL to build automatically. Automating SQL building is itself problematic because, although all database platforms use SQL, the millions of tiny differences among dialects are a burden that must be overcome.
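
Here is a small sketch of the idea, with identifier quoting as one concrete example of a dialect difference (this is my illustration, not the Nexus SQL builder):

```python
# Minimal sketch: turn the columns a user checkmarks into a SELECT statement.
# Identifier quoting is one of the many small per-dialect differences.
QUOTES = {"sqlserver": ("[", "]"), "postgres": ('"', '"'), "mysql": ("`", "`")}

def build_select(dialect, table, checked_columns):
    left, right = QUOTES[dialect]
    cols = ", ".join(f"{left}{col}{right}" for col in checked_columns)
    return f"SELECT {cols} FROM {table}"

print(build_select("sqlserver", "dbo.orders", ["order_id", "order date"]))
# SELECT [order_id], [order date] FROM dbo.orders
```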

The easy part is building a tool that can query all systems, convert table structures and data types in seconds, master each vendor’s load utility language, show tables visually, and automate SQL building. You must now bring it all together with two pieces of genius that make the impossible possible.

The first piece of genius is to allow cross-system joining of data by combining all of the previous accomplishments, while also letting the user process the join on the system where it makes the most sense. Data comes in all sizes, so processing the join is not a one-size-fits-all scenario. The user must have the option of joining the data on any on-premises database system, any cloud system, an available server, or their own PC or laptop. We call the system chosen to do the processing the Hub.
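
A simple way to picture the Hub decision: move the smaller side of the join to the system that already holds the most data, then join where the rows now sit. The sketch below is illustrative only (the row counts, system names, and selection rule are hypothetical, not Nexus’s planner):

```python
# Minimal sketch: pick a Hub so the least data has to move, then plan the copies.
def choose_hub(tables, systems):
    """Pick the system already holding the most rows."""
    return max(systems, key=lambda s: sum(t["rows"] for t in tables if t["system"] == s))

def plan_join(tables, systems):
    hub = choose_hub(tables, systems)
    moves = [t["name"] for t in tables if t["system"] != hub]
    return hub, moves

tables = [{"name": "sales",     "system": "yellowbrick", "rows": 2_000_000_000},
          {"name": "customers", "system": "sqlserver",   "rows": 50_000}]
hub, moves = plan_join(tables, ["yellowbrick", "sqlserver", "laptop"])
print(f"Hub: {hub}; tables moved to the Hub: {moves}")
# Hub: yellowbrick; tables moved to the Hub: ['customers']
```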

Now that your invention is nearing the peak of Mt. Everest, you realize that your decades of work are for naught if you cannot overcome the last challenge. The positive news is that you have overcome enough obstacles to let a user point and click to move data across platforms for a join. But what is the bad news?

The bad news is that when a user commands data to move from their PC or laptop, the data must flow through that same PC or laptop. The data flows from the source system, through the PC, and on to the target system, killing any idea of moving large amounts of data. The user is often outside the company firewall, working at home or on the road, which means the data path runs over a slow network connection and through a PC with minimal memory.

The network and PC bottleneck is like discovering that your electric car works, except the battery explodes when the driver goes over 45 miles per hour. You’ve just fallen off Mt. Everest, and this is going to leave a mark.

It is then and only then that you decide if you are one of those who won’t take no for an answer, refuse to quit, and believe there is always a way to make lemonade out of your expensive lemon!

There is only one path forward, and the solution becomes clear. You must allow the user to do everything from their PC or laptop, but you must also provide an option that lets a Server initiate and execute the work on a high-speed network. This Server is to the data highway what a cell tower is to a cellphone.

It took almost five years to develop the coordination between the user’s workstation and the Server, but now a company can place multiple Servers on-premises and on every cloud. The closer the Server is to the database platforms, the faster and more efficient the flow of data.
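
The key idea is that only a tiny job description ever leaves the laptop; the heavy rows travel between the databases and the Server on the fast data-center network. Here is an illustrative sketch of that division of labor (hypothetical messages, not the actual Nexus Desktop/Server protocol):

```python
# Minimal sketch: the desktop sends a small job description; the Server,
# sitting next to the databases, moves the actual data.
import json

def desktop_submit(job):
    """What travels from the laptop is a few hundred bytes of JSON, not the data."""
    return json.dumps(job)

def server_execute(job_json):
    job = json.loads(job_json)
    # Data flows source -> Server -> target over the high-speed network;
    # it never passes through the user's PC.
    return (f"moving {job['table']} from {job['source']} to {job['target']} "
            f"via server-side transfer")

msg = desktop_submit({"table": "sales", "source": "oracle", "target": "yellowbrick"})
print(server_execute(msg))
```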

And that is how the most incredible tool in data was born. Their names are Nexus Pro Desktop and Nexus Server.

It no longer matters where the user is working from or where the data resides. The user points and clicks, and a single button press can join billions of rows across dozens of systems on a data highway built for speed.

Virtualization is the future of data, and some companies will find benefits worth nearly a billion dollars!

I have partnerships with two industry leaders that put the icing on the cake. Soterosoft is the leader in encrypting sensitive data, which allows a company to provide access across a universe of data while still protecting what is sensitive.

The other vital component is my partnership with Yellowbrick, which has the most modern data warehouse technology available today. Yellowbrick provides a database with blistering performance that works on-premises or on any cloud, moves data at world-record speeds, and lets thousands of users query billions of rows in sub-second time. The Yellowbrick advantage brings a Virtualization Hub to an entirely new level.

Save your seat at our webinar this Thursday, January 21st, at 1:00 EST (10:00 Pacific time) at the link below.

https://www.yellowbrick.com/go/virtualization-at-your-fingertips/?utm_campaign=demo-webinar&utm_source=coffingdw&utm_medium=email