Gain Some IQ on AI

Today (1/8/18) NetApp announced a new partnership with NVIDIA and launched the NetApp ONTAP AI Proven Architecture. This strengthens their already growing foothold in this new and exciting branch of the IT industry, and after what was announced today, ONTAP AI is surely going to have everyone talking. This meet-in-the-channel play gives data scientists a proven architecture to use in their deep learning data pipeline, avoiding design guesswork and allowing for fast, efficient deployments of AI environments.

Machine learning (ML) and artificial intelligence (AI) place some unique demands on IT infrastructure. Firstly, they both demand huge amounts of information, a capacity requirement that is constantly growing. Second, they require storage that responds with ultra-low latency. Unlike big data workloads, you need to keep all the data generated rather than burning the hay to find the needle, so expandability over time is a must. And finally, the type of computation they undertake is better suited to a GPU than a CPU.

Now, whether you would class this as a "modernise your infrastructure" or a next-generation data centre play, one thing is certain: this is cutting-edge equipment. For example, a single NVIDIA DGX-1 is equivalent to replacing 400 traditional servers, and if you look at Gartner's top 10 picks for 2018 and beyond, the majority have an aspect of AI/ML to them, so it's only natural that we are seeing IT vendors moving into this space.

NetApp are announcing the ability to combine an AFF A800, their flagship all-flash array, with five NVIDIA DGX-1 systems with Tesla V100s, tied together over 100GbE with a pair of Cisco Nexus 3232C switches, which equates to 5,000 TFLOPS of compute.
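As a back-of-the-envelope check on that 5,000 TFLOPS figure, the sketch below assumes NVIDIA's quoted tensor (FP16) ratings: roughly 125 TFLOPS per Tesla V100 and eight GPUs per DGX-1. These per-GPU numbers come from NVIDIA's spec sheets rather than the announcement itself.

```python
# Rough check of the 5,000 TFLOPS launch-configuration figure.
# Assumption: ~125 TFLOPS tensor FP16 per Tesla V100 (NVIDIA spec),
# 8 GPUs per DGX-1, 5 DGX-1 systems alongside one AFF A800.

TFLOPS_PER_V100 = 125   # tensor FP16, per NVIDIA's published spec
GPUS_PER_DGX1 = 8
NUM_DGX1 = 5            # the launch configuration described above

tflops_per_dgx1 = TFLOPS_PER_V100 * GPUS_PER_DGX1
total_tflops = tflops_per_dgx1 * NUM_DGX1

print(f"{tflops_per_dgx1} TFLOPS per DGX-1, {total_tflops} TFLOPS total")
```

Running this gives 1,000 TFLOPS per DGX-1 and 5,000 TFLOPS in aggregate, matching the announced figure.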

Whilst the messaging around this offering highlights it as a future-proof play, you don't need to buy everything in one go; instead you can build upon NetApp's key messages of flexibility and scaling. But if you were to plan ahead, or really did need to start big, there is no reason you could not have twelve high-availability pairs with sixty (60x) DGX-1 systems and close to 75PB of storage. There is also no reason you couldn't implement a data pipeline with an A700s, or even an A300 or A220; it all depends on what performance and scalability you require. Tie this together with edge devices running ONTAP Select for data ingest, the ability to use Cloud Volumes ONTAP in AWS or Azure, and possibly FabricPool for an archival tier, and you can truly see why integrating the Data Fabric into this story is such a nice fit. Just imagine adding MAX Data into the mix; it would be like strapping two F9 first-stage boosters onto this already Full Thrust rocket.
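To put the maximum scale-out configuration above in perspective, here is a hedged sketch of the aggregate numbers. The ~1 PFLOPS per DGX-1 figure reuses NVIDIA's tensor FP16 rating; the per-HA-pair capacity is simply derived from the ~75PB / 12-pair totals quoted above, so treat it as illustrative rather than a NetApp-stated spec.

```python
# Illustrative sketch of the maximum configuration mentioned above.
# Assumptions: ~1 PFLOPS tensor FP16 per DGX-1 (NVIDIA figure);
# capacity per HA pair derived from the quoted ~75PB / 12 pairs.

DGX1_PFLOPS = 1.0
MAX_DGX1 = 60
MAX_HA_PAIRS = 12
TOTAL_CAPACITY_PB = 75   # approximate, as quoted in the post

compute_pflops = DGX1_PFLOPS * MAX_DGX1
capacity_per_pair_pb = TOTAL_CAPACITY_PB / MAX_HA_PAIRS

print(f"{compute_pflops:.0f} PFLOPS across {MAX_DGX1} DGX-1 systems")
print(f"~{capacity_per_pair_pb:.2f} PB per HA pair")
```

In other words, the full build-out lands at roughly 60 PFLOPS of tensor compute backed by around 6PB per HA pair, an order of magnitude beyond the launch configuration.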

Now, you may be thinking this is a supercomputer niche corner case, but in reality it is being utilised in pretty much every industry vertical, affecting almost every aspect of our daily lives. Finance, health, automotive, retail, agriculture, oil and gas, and even the legal industry, to name a few, are already seeing a surge in software and companies dedicated to this way of doing business. We have the horror stories of Facebook, and no doubt you have invested in one of the big three home-automation voice assistants featuring Alexa, Siri or Assistant. Maybe you have travelled using Uber or Tesla's Autopilot, or even Waze on your phone. Maybe you have a hobby like flying drones from DJI or utilise 3DR's software, or you can't work out without your Fitbit or Fenix. The point is that you are providing data back to some central point, where it is analysed so the company can make better decisions about what to bring to market as a next-generation product or where to improve something already in the field. Whilst the luddites worry that AI will lead to Skynet and the doom of humanity, it is probably better to think of it as an advancement in human intelligence and another milestone down the path of evolution, and I look forward to seeing how this architecture develops.