NetApp is probably not a name you are familiar with unless you are an IT technician or specialist at your company. If you are, great: you already know what they do and how they serve the commercial market. If you are not, allow us to enlighten you.
NetApp, in its simplest, most general description, is a hybrid cloud service provider. Put plainly, NetApp is a solution for managing its clients' data, whether on physical servers or in the cloud. To be completely accurate, though, they provide their clients with more than just data management in whichever space they choose. They provide companies with something they call Data Fabric. The question then becomes: what is Data Fabric?
We will revisit Data Fabric in detail (as explained by NetApp's South East Asian CTO) at a later date. For now, you will have to make do with our own explanation of what Data Fabric is.
Data Fabric, in its plainest form, is a data management solution that links and transfers multiple data strands and clusters across different storage types and locations. It is a bit like building a high-speed expressway for data to travel between two storage locations and types. It bridges data across multiple platforms for whatever application the client needs to run. If this sounds over-simplified, that is because it is.
For now, though, that explanation will have to do. We had a chance to sit down with NetApp's CTO, Matthew Hurford, for a chat about their latest collaboration with their long-time client, MAMPU. The discussion covered a topic that every government and organisation is talking about: Smart Cities. In the first part of this two-part interview series, we spoke to Matthew about the concept of Smart Cities itself and Malaysia's vision of a smart city, here.
In this second and final part of the series, we focus on NetApp's capabilities and contribution to the Smart City initiative. We also explore the role of Artificial Intelligence (AI) in the larger initiative.
NetApp for Smart Cities
NetApp, as mentioned, is a data storage and access specialist. In the simplest terms, they provide data storage capabilities. Beyond that, though, they also offer something called data efficiency. In practice, NetApp helps clients like MAMPU store less data and use less of their server space. That sounds almost ironic, especially since NetApp considers data the core of a smart city, but duplicated data can waste storage, and that waste is what NetApp is trying to avoid. NetApp's solution also compresses the data collected without losing its integrity, so it remains fully usable. In this way, NetApp helps its clients save on infrastructure costs.
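The deduplication and compression idea described above can be sketched in a few lines. The example below is an illustrative toy, not NetApp's actual algorithm: it splits data into fixed-size blocks, stores each unique block only once (keyed by its hash), compresses what it keeps, and records block references so the original data can be rebuilt without any loss of integrity.

```python
import hashlib
import zlib

BLOCK_SIZE = 4096  # illustrative block size, not NetApp's actual value

def dedup_store(data: bytes, store: dict) -> list:
    """Split data into blocks; keep one compressed copy per unique block."""
    refs = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:          # duplicate blocks are stored only once
            store[digest] = zlib.compress(block)
        refs.append(digest)
    return refs

def restore(refs: list, store: dict) -> bytes:
    """Rebuild the original data, byte for byte, from block references."""
    return b"".join(zlib.decompress(store[d]) for d in refs)

store = {}
data = b"sensor-reading " * 10_000      # highly repetitive sample data
refs = dedup_store(data, store)
assert restore(refs, store) == data     # integrity is preserved exactly
saved = sum(len(v) for v in store.values())
print(f"original: {len(data)} bytes, stored: {saved} bytes")
```

On repetitive data like the sample above, the store ends up holding far fewer bytes than the original, which is exactly the infrastructure saving the paragraph describes.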
The best part of all this, though, is that NetApp allows its users to access this data from anywhere. Technically, clients may not even need physical infrastructure on site to access and use the data. NetApp lets its users pull up the data from any place, at any time, and quickly at that. It sounds much like an extranet service, except that you get full remote access to the petabytes of data that a city might collect in its data lake.
We also visited the concept of Open Data, which NetApp thinks might be the center of a smart city. The concept is simple enough to understand: nearly anyone who might contribute to the data lake is given access to the lake itself. In that sense, anyone with access can use the data to create something new from it, or to improve their decision making. The NetApp Open Data concept can be likened to an open-source system, except that it is not exactly the same.
An open-source organisation like Red Hat publishes a problem or a set of code in the public space. Anyone can take and use that code to solve their own problems, or modify it to solve problems posed by Red Hat. The Open Data system is far more general than that. It does not contain code for software systems; rather, it holds data collected by multiple sensors in multiple environments. That data is then used to make sense of incidents, analyse patterns, and indirectly form solutions.
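As a toy illustration of that last point (with entirely made-up numbers, since no real dataset is described here), open sensor readings can be analysed for patterns with a few lines of standard Python:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical open-data records: (hour of day, vehicles counted by a road sensor)
readings = [(8, 420), (8, 465), (9, 380), (12, 210), (17, 455), (17, 490), (22, 60)]

# Group readings by hour and average them to reveal a daily traffic pattern
by_hour = defaultdict(list)
for hour, count in readings:
    by_hour[hour].append(count)

averages = {hour: mean(counts) for hour, counts in by_hour.items()}
peak_hour = max(averages, key=averages.get)
print(f"busiest hour: {peak_hour}:00 (avg {averages[peak_hour]:.0f} vehicles)")
```

Anyone with access to the lake could run this kind of analysis, which is the point of opening the data up in the first place.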
The question that stems from there is, obviously, security. If everyone has access to the data lake, what about security and privacy? How safe are we? According to Matthew, the data ultimately used in a smart city environment is mostly extracted from sensors rather than taken from the internet, so technically speaking, any breach of private data should be minimal.
That said, NetApp has its own data encryption systems and built-in security measures. They have worked with many governments and military organisations and have kept that data completely safe. Obviously, though, there is the problem of human error. NetApp does everything it can to prevent data breaches or data corruption, and even briefs its clients on security and security awareness.
If clients choose to work with other vendors as well, NetApp can accommodate them: NetApp can integrate its services with any security vendor its client chooses. With Data Fabric, NetApp can even integrate with services like Amazon Web Services, Azure, and the like. This flexibility through API plugins helps clients familiarise themselves with NetApp's services quickly and as painlessly as possible.
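The plugin-style flexibility described above can be sketched with a simple storage-backend abstraction. Everything here is hypothetical: the backend classes and method names are invented for illustration and are not NetApp's or any cloud provider's real API. The point is the design idea, that code written against a common interface can move data between a local backend and a cloud backend without the application changing.

```python
from abc import ABC, abstractmethod

class StorageBackend(ABC):
    """Hypothetical common interface; a real plugin would wrap a vendor's SDK."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class LocalBackend(StorageBackend):
    """In-memory stand-in for on-premises storage."""
    def __init__(self):
        self._objects = {}
    def put(self, key, data):
        self._objects[key] = data
    def get(self, key):
        return self._objects[key]

class FakeCloudBackend(StorageBackend):
    """Stand-in for an AWS/Azure plugin; a real one would call the provider's API."""
    def __init__(self):
        self._objects = {}
    def put(self, key, data):
        self._objects[key] = data
    def get(self, key):
        return self._objects[key]

def migrate(src: StorageBackend, dst: StorageBackend, keys: list) -> None:
    """Move data across backends using only the shared interface."""
    for key in keys:
        dst.put(key, src.get(key))

local, cloud = LocalBackend(), FakeCloudBackend()
local.put("traffic/2019-01", b"sensor data")
migrate(local, cloud, ["traffic/2019-01"])
print(cloud.get("traffic/2019-01"))
```

Because `migrate` only sees the `StorageBackend` interface, swapping one backend for another never touches the migration logic, which is the same property that makes plugin integration painless for the client.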
Source: NetApp South East Asia
Also published on Medium.