Category Archives: Contributed

2022 and Beyond – Technologies that will Change the Dialogue

We are living in a do-anything-from-anywhere economy enabled by an exponentially expanding data ecosystem. It’s estimated that 65% of global GDP will be digital next year (2022). This influx of data presents both opportunities and challenges. After all, success in our digital present and future relies on our ability to secure and maintain increasingly complex IT systems. Here I’ll examine both near-term and long-term predictions that address the way the IT industry will deliver the platforms and capabilities to harness this data and transform our experiences at work, at home and in the classroom.

What to look for in 2022:  

The Edge discussion will separate into two focus areas – edge platforms that provide a stable pool of secure capacity for diverse edge ecosystems, and software-defined edge workloads/software stacks that extend application and data systems into real-world environments. Separating the edge platforms from the edge workloads is critical: if each edge workload creates its own dedicated platform, we will end up with a proliferation of edge infrastructure and unmanageable infrastructure sprawl.

woman using a computer
Photo by cottonbro on Pexels.com

Imagine an edge environment where you deploy an edge platform that presents compute, storage, I/O and other foundational IT capacities in a stable, secure, and operationally simple way. As you extend various public and private cloud data and applications pipelines to the edge along with local IoT and data management edges, they can be delivered as software-defined packages leveraging that common edge platform of IT capacity. This means that your edge workloads can evolve and change at software speed because the underlying platform is a common pool of stable capacity.

We are already seeing this shift today. Dell Technologies currently offers edge platforms for all the major cloud stacks, using common hardware and delivery mechanisms. As we move into 2022, we expect these platforms to become more capable and pervasive. We are already seeing most edge workloads – and even most public cloud edge architectures – shift to software-defined architectures using containerisation and assuming the standard availability of capabilities such as Kubernetes as the dial tone. This combination of modern edge platforms and software-defined edge systems will become the dominant way to build and deploy edge systems in the multi-cloud world.
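
To make the idea of Kubernetes as the dial tone for software-defined edge workloads concrete, here is a minimal sketch using the official Kubernetes Python client; the "edge" namespace, image and labels are illustrative assumptions rather than any vendor's actual packaging, and it assumes a kubeconfig pointing at the edge platform's cluster.

    # Treat the edge platform as shared Kubernetes capacity and push the workload
    # onto it as a software-defined package (a Deployment). Names are placeholders.
    from kubernetes import client, config

    def deploy_edge_workload(name: str, image: str, replicas: int = 1) -> None:
        config.load_kube_config()  # or load_incluster_config() when running on the platform
        container = client.V1Container(name=name, image=image)
        template = client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": name}),
            spec=client.V1PodSpec(containers=[container]),
        )
        spec = client.V1DeploymentSpec(
            replicas=replicas,
            selector=client.V1LabelSelector(match_labels={"app": name}),
            template=template,
        )
        deployment = client.V1Deployment(metadata=client.V1ObjectMeta(name=name), spec=spec)
        client.AppsV1Api().create_namespaced_deployment(namespace="edge", body=deployment)

    # Example: deploy_edge_workload("vision-inference", "registry.example.com/vision:1.2")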

The opening of the private mobility ecosystem will accelerate with more cloud and IT industries involved on the path to 5G. Enterprise use of 5G is still early. In fact, today 5G is not significantly different or better than WiFi in most enterprise use cases. This will change in 2022 as more modern, capable versions of 5G become available to enterprises. We will see higher-performance, more scalable 5G along with new 5G features such as Ultra-Reliable Low-Latency Communications (URLLC) and Massive Machine-Type Communications (mMTC), with the dialogue increasingly driven by players beyond the traditional telecoms (think: the open-source ecosystem, infrastructure companies and non-traditional telecom providers).

signal tower
Photo by Miguel Á. Padriñán on Pexels.com

More importantly, we expect the ecosystem delivering new and more capable private mobility to expand to include not only IT providers such as Dell Technologies but also public cloud providers and even new open-source communities focused on accelerating the open 5G ecosystem.

Edge will become the new battleground for data management as data management becomes a new class of workload. The data management ecosystem needs an edge. The modern data management industry began its journey on public clouds processing and analysing non-real-time centralised data. As the digital transformation of the world accelerates, it has become clear that most of the data in the world will be created and acted on outside of centralised data centers. We expect that the entire data management ecosystem will become very active in developing and utilising edge IT capacity as the ingress and egress of their data pipelines but will also utilise edges to remotely process and digest data.

As the data management ecosystem extends to the edge this will dramatically increase the number of edge workloads and overall edge demand. This correlates to our first prediction on edge platforms as we expect these data management edges to be modern software-defined offerings. Data management and the edge will increasingly converge and reinforce each other. IT infrastructure companies, like Dell Technologies, have the unique opportunity to provide the orchestration layer for edge and multi-cloud by delivering an edge data management strategy.

The security industry is now moving from discussion of emerging security concerns to a bias toward action. Enterprises and governments are facing threats of greater sophistication and impact on revenue and services. At the same time, the attack surface that hackers can exploit is growing, driven by the accelerated trend toward remote work and digital transformation. As a result, the security industry is responding with greater automation and integration. The industry is also pivoting from automated detection to prevention and response, with a focus on applying AI and machine learning to speed remediation. This is evidenced by industry initiatives like SOAR (Security Orchestration, Automation and Response), CSPM (Cloud Security Posture Management) and XDR (Extended Detection and Response). Most importantly, we are seeing new efforts such as the Open Source Security Foundation (OpenSSF) under the Linux Foundation ramp up the coordination and active involvement of the IT, telecom and semiconductor industries.
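
To make the shift from detection toward automated prevention and response more tangible, here is a small, vendor-neutral sketch of a SOAR-style playbook rule; the alert fields, severity scale, indicator set and response hooks are assumptions for illustration, not any product's API.

    # A conceptual SOAR-style rule: high-severity detections or known-bad indicators
    # trigger containment automatically, medium ones go to human triage.
    KNOWN_BAD_HASHES = {"44d88612fea8a8f36de82e1278abb02f"}  # example indicator set

    def respond(alert: dict, isolate_host, open_ticket) -> str:
        severity = alert.get("severity", 0)          # assume a 0-10 scale
        indicator = alert.get("file_hash")
        if severity >= 8 or indicator in KNOWN_BAD_HASHES:
            isolate_host(alert["host"])              # contain first ...
            open_ticket(alert, priority="P1")        # ... then hand off for remediation
            return "contained"
        if severity >= 5:
            open_ticket(alert, priority="P3")        # human triage, no automatic action
            return "triage"
        return "logged"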

close up view of system hacking
Photo by Tima Miroshnichenko on Pexels.com

Across all four of these areas – edge, private mobility, data management and security – there is a clear need for a broad ecosystem where both public cloud and traditional infrastructure are integrated. We are now clearly in a multi-cloud, distributed world where the big challenges can no longer be solved by a single data center, cloud, system or technology.

What to look for beyond 2022:

Quantum Computing – Hybrid quantum/classical compute will take center stage, providing greater access to quantum. In 2022 we expect two major industry consensuses to emerge. First, the industry will agree that the inevitable topology of a quantum system is a hybrid quantum computer, in which the quantum hardware, or quantum processing units (QPUs), are specialised compute systems that look like accelerators and focus on specific quantum mathematics and functions. The QPUs will be surrounded by conventional compute systems that pre-process the data, run the overall process and interpret the output of the QPUs.

Early real-world quantum systems are all following this hybrid quantum model, and we see a clear path where the collaboration of classical and quantum compute is inevitable. The second major consensus is that quantum simulation using conventional computing will be the most cost-effective and accessible way to get quantum systems into the hands of our universities, data science teams and researchers. In fact, Dell and IBM have already announced significant work in making quantum simulation available to the world.
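
For readers wondering what quantum simulation on conventional hardware actually involves, here is a toy statevector sketch in Python/NumPy; it only illustrates the principle and is in no way the Dell or IBM simulation work referenced above.

    # Simulate a 2-qubit Bell state on a classical CPU by tracking the statevector.
    import numpy as np

    H = (1 / np.sqrt(2)) * np.array([[1, 1], [1, -1]])   # Hadamard gate
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])                      # control q0, target q1

    state = np.zeros(4)
    state[0] = 1.0                                       # start in |00>
    state = np.kron(H, I) @ state                        # H on qubit 0
    state = CNOT @ state                                 # entangle the qubits

    probabilities = np.abs(state) ** 2                   # ~[0.5, 0, 0, 0.5]
    print(dict(zip(["00", "01", "10", "11"], probabilities.round(3))))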

Automotive – The automotive ecosystem will rapidly shift focus from a mechanical ecosystem to a data and compute industry. The industry is transforming at several levels. We are seeing a shift from internal combustion engines to electrified vehicles, resulting in a radical simplification of the physical supply chain. We are also seeing a significant expansion of software and compute content within our automobiles via ADAS and autonomous vehicle efforts. Finally, we are seeing the automotive industry become data driven in everything from entertainment to safety to major disruptions such as Car-as-a-Service and automated delivery.

All of this says that the automotive and transportation industries are beginning a rapid transition to being driven by software, compute and data. We have seen this in other industries such as telecom and retail, and in every case the result is increased consumption of IT technology. Dell is actively engaged with most of the world’s major automotive companies in their early efforts, and we expect 2022 to see them continue their evolution towards digital transformation and deeper interaction with IT ecosystems.

Photo by Jonas Leupe on Unsplash

Digital Twins – Digital Twins will become easier to create and consume as the technology is more clearly defined with dedicated tools. While gaining in awareness, digital twins are still a nascent technology with few real examples in production. Over the next several years, we’ll see digital twins become easier to create and consume as we define standardised frameworks, solutions and platforms. Making digital twin ideas more accessible will enable enterprises to provide enhanced analytics and predictive models to accelerate digital transformation efforts. Digital twin adoption will become more mainstream with accelerated standardisation and availability of solutions and frameworks, bringing deployment and investment costs down. Digital twins will be the core driver of digital transformation 3.0, combining measured and modeled/simulated worlds for direct business value across industry verticals.
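
As a rough sketch of the "measured plus modeled" idea behind a digital twin, consider the following; the pump asset, drift model, hourly readings and alert threshold are all invented for illustration, not a reference implementation.

    # A toy digital twin: live readings update the twin, a simple model projects the
    # expected next reading, and a large gap between the two flags the physical asset.
    from dataclasses import dataclass, field

    @dataclass
    class PumpTwin:
        expected_temp_rise_per_hr: float = 0.2               # modeled behaviour
        measured_temps: list = field(default_factory=list)   # measured behaviour (hourly)

        def ingest(self, temp_c: float) -> None:
            self.measured_temps.append(temp_c)

        def needs_attention(self, tolerance_c: float = 5.0) -> bool:
            if len(self.measured_temps) < 2:
                return False
            modeled = self.measured_temps[-2] + self.expected_temp_rise_per_hr
            return abs(self.measured_temps[-1] - modeled) > tolerance_c

    twin = PumpTwin()
    for reading in (40.1, 40.3, 47.9):                       # last reading drifts off-model
        twin.ingest(reading)
    print(twin.needs_attention())                            # True -> schedule an inspection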

As a technology optimist, I increasingly see a world where humans and technology work together to deliver impactful outcomes at an unprecedented speed. These near-term and long-term perspectives are based on the strides we’re making today. If we see even incremental improvement, there is enormous opportunity to positively transform the way we work, live and learn, and 2022 will be another year of accelerated technology innovation and adoption.

Six Edge Computing Trends to Watch in 2022

While many aspects of edge computing are not new, the overall picture continues to evolve quickly. For example, “edge computing” encompasses the distributed retail store branch systems that have been around for decades. The term has also swallowed all manner of local factory floor and telecommunications provider computing systems, albeit in a more connected and less proprietary fashion than was the historical norm.

However, even if we see echoes of older architectures in certain edge computing deployments, we also see developing edge trends that are genuinely new or at least quite different from what existed previously. These trends are helping IT and business leaders solve problems in industries ranging from telco to automotive, for example, as both sensor data and machine learning data proliferate.

Edge computing trends that should be on your radar

Here, edge experts explore six trends that IT and business leaders should focus on in 2022:

1. Edge workloads get fatter

One big change we are seeing is that there is more computing and more storage out on the edge. Historically, decentralized systems existed more to reduce reliance on network links than to perform tasks that could not practically be done in a central location, assuming reasonably reliable communications. But that is changing.

server racks on data center
Photo by Brett Sayles on Pexels.com

Almost by definition, IoT has always involved collecting at least some data. However, what used to be a trickle has now turned into a flood as the data required for machine learning (ML) applications flows in from a multitude of sensors. And even if training models are often developed in a centralized data centre, the ongoing application of those models is usually pushed out to the edge of the network. This limits network bandwidth requirements and allows for rapid local action, such as shutting down a machine in response to anomalous sensor readings. The goal is to deliver insights and take action at the moment they’re needed.
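
A minimal sketch of that pattern follows: a model "trained" centrally (here just a learned threshold standing in for a real ML model) is applied locally so the device can act immediately; the sensor stream and shutdown hook are assumptions for illustration.

    # Central "training": learn what normal vibration looks like from historical data.
    from statistics import mean, stdev

    def build_anomaly_check(training_readings):
        mu, sigma = mean(training_readings), stdev(training_readings)  # needs >= 2 samples
        def is_anomalous(reading: float) -> bool:
            return abs(reading - mu) > 3 * sigma       # 3-sigma rule as the "model"
        return is_anomalous

    # Edge inference: no round trip to the cloud before acting.
    def run_at_the_edge(sensor_stream, is_anomalous, shut_down_machine):
        for reading in sensor_stream:
            if is_anomalous(reading):
                shut_down_machine()                    # act locally, within milliseconds
                break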

2. RISC-V gains ground

Of course, workloads that are both data- and compute-intensive need hardware on which to run. The specifics vary depending upon the application and the tradeoffs required between performance, power, cost, and so forth. Traditionally the choice has usually come down to either something custom, ARM, or x86. None are fully open, although ARM and x86 have developed a large ecosystem of supporting hardware and software over time, largely driven by the lead processor component designers.

But RISC-V is a new and intriguing alternative: an open instruction set architecture.

Why intriguing? Here’s how Red Hat Global Emerging Technology Evangelist Yan Fisher puts it: “The unique aspect of RISC-V is that its design process and the specification are truly open. The design reflects the community’s decisions based on collective experience and research.”

This open approach, and an active ecosystem to go along with it, is already helping to drive RISC-V design wins across a broad range of industries. Calista Redmond, CEO of RISC-V International, observes that: “With the shift to edge computing, we are seeing a massive investment in RISC-V across the ecosystem, from multinational companies like Alibaba, Andes Technology, and NXP to startups like SiFive, Esperanto Technologies, and GreenWaves Technologies designing innovative edge-AI RISC-V solutions.”

3. Virtual Radio Access Networks (vRAN) become an increasingly important edge use case

A radio access network is responsible for enabling and connecting devices such as smartphones or internet of things (IoT) devices to a mobile network. As part of 5G deployments, carriers are shifting to a more flexible vRAN approach whereby the high-level logical RAN components are disaggregated by decoupling hardware and software, as well as using cloud technology for automated deployment and scaling and workload placement.

Photo by Z z on Pexels.com

Hanen Garcia, Red Hat Telco Solutions Manager, and Ishu Verma, Red Hat Emerging Technology Evangelist, note that “One study indicates deployment of virtual RAN (vRAN)/Open RAN (oRAN) solutions realize network TCO savings of up to 44% compared to traditional distributed/centralized RAN configurations.” They add that: “Through this modernization, communications service providers (CSPs) can simplify network operations and improve flexibility, availability, and efficiency—all while serving an increasing number of use cases. Cloud-native and container-based RAN solutions provide lower costs, improved ease of upgrades and modifications, ability to scale horizontally, and with less vendor lock-in than proprietary or VM-based solutions.”

4. Scale drives operational approaches

Many aspects of an edge-computing architecture can be different from one that’s implemented solely within the walls of a data centre. Devices and computers may have weak physical security and no IT staff on-site. Network connectivity may be unreliable. Good bandwidth and low latencies aren’t a given. But many of the most pressing challenges relate to scale; there may be thousands (or more) of network endpoints.

Kris Murphy, Senior Principal Software Engineer at Red Hat, identifies four primary steps you must take in order to deal with scale: “Standardize ruthlessly, minimize operational ‘surface area,’ pull whenever possible over push, and automate the small things.”

For example, she recommends doing transactional, which is to say atomic, updates so that a system can’t end up only partially updated and therefore in an ill-defined state. When updating, she also argues that it’s a good practice for endpoints to pull updates because “egress connectivity is more likely available.” One should also take care to limit peak loads by not doing all updates at the same time.
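
Those recommendations can be pictured with a small sketch for a fleet of endpoints; the update URL and install path are placeholders, and the "atomic" apply relies on downloading to a staging file and renaming it into place.

    import os, random, time, urllib.request

    def pull_update(url: str, target: str, max_stagger_s: int = 3600) -> None:
        time.sleep(random.uniform(0, max_stagger_s))     # spread peak load across the fleet
        staging = target + ".staging"
        urllib.request.urlretrieve(url, staging)         # pull, rather than wait for a push
        os.replace(staging, target)                      # all-or-nothing rename on the same filesystem

    # Example: pull_update("https://updates.example.com/agent-1.4.2.bin", "/opt/agent/agent.bin")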

5. Edge computing needs attestation

With resources at the edge tight, capabilities that require little to no local resources are the pragmatic options to consider. Furthermore, any approach needs to be highly scalable; otherwise, the uses and benefits become extremely limited. One option that stands out is the Keylime project. “Technologies like Keylime, which can verify that computing devices boot up and remain in a trusted state of operation at scale, should be considered for broad deployment, especially for resource-constrained environments,” as Ben Fischer, Red Hat Emerging Technology Evangelist, describes it.

Photo by RoonZ.nl on Unsplash

Keylime provides remote boot and runtime attestation using Integrity Measurement Architecture (IMA) and leverages Trusted Platform Modules (TPMs), which are common to most laptop, desktop, and server motherboards. If no hardware TPM is available, a virtual TPM (vTPM) can be loaded to provide the requisite functionality. Boot and runtime attestation is a means to verify that the edge device boots to a known trusted state and maintains that state while running. In other words, if something unexpected happens, such as a rogue process, the device’s state diverges from the expected one; the change shows up in the measurement, and the edge device is taken offline because it has entered an untrusted state. The device can then be investigated, remediated, and put back into service in a trusted state.
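
Conceptually, the verifier side of such attestation can be pictured with the sketch below; this is not Keylime's actual API, and the known-good values and quarantine hook are assumptions for illustration.

    # Compare each device's reported measurement against its known-good value and
    # quarantine anything that drifts into an untrusted state.
    import hashlib

    KNOWN_GOOD = {"edge-gw-01": hashlib.sha256(b"trusted-boot-chain-v7").hexdigest()}

    def verify(device_id: str, reported_measurement: str, quarantine) -> bool:
        expected = KNOWN_GOOD.get(device_id)
        if reported_measurement != expected:
            quarantine(device_id)        # take the device offline until remediated
            return False
        return True                      # device booted, and remains, in a trusted state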

6. Confidential Computing becomes more important at the edge

Security at the edge requires broad preparation. The availability of resources such as network connectivity, electricity, staff, equipment, and functionality varies widely, but is far less than what would be available in a data centre. These limited resources constrain the capabilities for ensuring availability and security. Besides encrypting local storage and connections to more centralized systems, confidential computing offers the ability to encrypt data while it is in use by the edge computing device.

This protects both the data being processed and the software processing the data from being captured or manipulated. Fischer argues that “confidential computing on edge computing devices will become a foundational security technology for computing at the edge, due to the limited edge resources.”

According to the Confidential Computing Consortium’s (CCC) report by the Everest Group, Confidential Computing – The Next Frontier in Data Security, “Confidential computing in a distributed edge network can also help realize new efficiencies without affecting data or IP privacy by building a secure foundation to scale analytics at the edge without compromising data security.” Additionally, confidential computing “ensures only authorized commands and code are executed by edge and IoT devices. Use of confidential computing at the IoT and edge devices and back end helps control critical infrastructure by preventing tampering with code or data being communicated across interfaces.”

Confidential computing applications at the edge range from autonomous vehicles to collecting sensitive information.

Diverse applications across industries

The diversity of these edge computing trends reflects both the diversity and scale of edge workloads. There are some common threads – multiple physical footprints, the use of cloud-native and container technologies, an increasing use of machine learning. However, telco applications often have little in common with industrial IoT use cases, which in turn differ from those in the automotive industry. But whatever industry you look at, you’ll find interesting things happening at the edge in 2022.

The Cloud and the Opportunity Ahead

A lot of what we do now is underpinned by the cloud, and “cloud” has increasingly become a tech buzzword. There are many reasons there is buzz around the cloud, and I will expand on some of them here.

Photo by Redd Angelo on StockSnap

Cloud democratises access to the kind of computing power that was previously only accessible to large corporations with deep pockets. What used to require a $100 million investment can now be achieved on the cloud for as little as $26 a year. And, by not spending time and resources on traditional IT infrastructure, companies using the cloud can build faster, better, and cheaper – in more sustainable ways. Cloud is flexible, agile, scalable, and has the potential to impact all industries in ways that were unimaginable just a few years ago across healthcare, finance, agriculture, education, and sustainability, to name a few. And as the demand for cloud computing grows, so does the demand for cloud-skilled workers. It has been predicted that there will be a significant skills gap by 2025 unless more is done to train, retrain, and upskill the region’s workforce.

Driving digital transformation and harnessing data

In today’s digital economy, it’s hard to find an industry that doesn’t use cloud applications. From accelerating medical research, improving crop yields in developing economies, and driving sustainability, to tracking bush fires, the cloud is changing the way we live, work, and play. Digital transformation is both an agent of change and a facilitator of it, and some of the biggest disruptions have been in the banking sector as we change the way we bank. There are more than 50 digital banks across Asia, with more on the way, helping drive financial inclusion in developing countries using the cloud. Today’s digital bank customers have high expectations for convenience, enhanced user experience, and personalisation, and access to the cloud has enabled these banks to innovate to meet these demands quickly and at low cost.

The pandemic has accelerated disruption and cloud adoption, and the volume of data produced as industries move to the cloud is growing rapidly. This data holds the potential for insights that can inform business strategies and is a resource that can’t be ignored. While some businesses are already leveraging data to drive decisions, gain competitive advantage, and fuel the next generation of innovation and success, more will do so in the coming year as business leaders start to understand the potential that cloud computing presents.

Photo by Lukas from Pexels

Data and analytics will become this decade’s priorities, and we must be ready with the necessary tools, skills, and expertise to tap into this resource to deliver efficiency and unlock experimentation. For many organisations, data is their most valuable asset, and we are helping them move data to the cloud, modernise applications, build next-generation secure data platforms, and build data lakes to collect real-time data. And, using Machine Learning (ML) algorithms, these organisations can gain real-time actionable insights, results, and predictions to improve decision making.

The digital skills gap

The rapid evolution of cloud technology and widespread adoption of cloud computing will require a workforce that has the right data and cloud skills, and across Asia, the supply of digitally skilled workers is nowhere near the demand. COVID-19 accelerated the adoption of cloud technology, which meant the skills gap widened as the global talent landscape transformed. Digital workers in Asia today know they will need advanced digital skills – almost half believe cloud computing skills will be required in their jobs within just four years.

Photo by ThisIsEngineering from Pexels

Broadening the skills base of workers globally is vital for economic growth, resiliency, and prosperity, and the social implications of failing to act include rising income disparity and more unemployment. Since COVID-19, there has been mass labour market displacement, with job losses predicted to far exceed those of the Global Financial Crisis and unemployment forecast to be at its highest since the Great Depression. With this in mind, governments around the world are implementing national policies on skilling and laying the building blocks for reforms, but more needs to be done by the private sector. Employers need to help current workers upskill, educational institutes need to adopt curricula that provide relevant skills, and workers across all fields need to seize the opportunity to learn new digital skills.

AWS is invested in the future

AWS is committed to a dynamic and entrepreneurial IT sector and supporting economic growth globally, and we hope to build resilience into the digitally skilled workforce and help bridge the skills gap. Globally, we are committed to helping 29 million people grow their technical skills with free cloud computing training by 2025. We have made over 500 free, on-demand courses available online, with many courses available in local languages such as Bahasa Indonesia, Japanese, Korean, and Simplified and Traditional Chinese, as well as interactive labs and virtual day-long training sessions through AWS Training and Certification. We are also working with educational institutes around the region to develop programmes that provide students with relevant, in-demand cloud tech skills.

The world’s workforce needs a sustainable future, and Amazon is committed to helping provide this by making more than 91 renewable energy investments around the world and committing to Amazon’s Climate Pledge to be a net-zero carbon business by 2040, 10 years ahead of the Paris Agreement, and to be on 100% renewable energy by 2025.

The cloud has the power to do a lot of good, but we must be prepared to harness that power with a skilled workforce that can meet the challenge to innovate at exponential speed. As the world emerges from the COVID-19 pandemic with new ways of operating, working, and living being adopted, cloud will remain at the forefront of our digital lives.

Keeping Up with the Pace of Innovation with the Cloud

When I was a young boy growing up in Jersey in the British Channel Islands, I’d turn on the grainy TV to warm up so I could watch sports with my father and brother. FORMULA 1 racing was the most exciting sport for us, even though the cars often sped by faster than the camera operator and the technology could keep up.

Now, racing is covered in a far richer and more engaging way, especially since F1 launched F1 Insights powered by AWS in 2018, bringing data analytics as a live feed to my screens. Watching on my phone in Singapore, I love the real-time Car Performance Scores, which include thousands of data points streamed every second from every car on the track, giving me a much better understanding of where my favorite car ranks in the field – and what’s driving its performance.
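
F1's actual scoring models are proprietary, but a toy sketch of ranking cars from a telemetry stream might look like the following; the field names, weights and window size are invented for illustration.

    # Keep a rolling window of samples per car and rank by a simple composite score.
    from collections import defaultdict, deque

    WINDOW = 100                                   # roughly the last 100 samples per car
    samples = defaultdict(lambda: deque(maxlen=WINDOW))

    def ingest(reading: dict) -> None:
        score = 0.7 * reading["speed_kph"] + 0.3 * reading["throttle_pct"]
        samples[reading["car"]].append(score)

    def live_ranking() -> list:
        averages = {car: sum(w) / len(w) for car, w in samples.items() if w}
        return sorted(averages, key=averages.get, reverse=True)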

time lapse photography of brown concrete building
Photo by zhang kaiyv on Pexels.com

It’s exactly this type of real-time information that businesses need to understand their performance, so they can make decisions rapidly and keep up with market changes. During the pandemic, we have learned that speed matters, whether you’re a digital native or a more traditional organization. As all businesses faced social distancing measures, those who survived the pandemic adopted new ways to do business, and they adapted fast using the cloud.

Some moved faster than others. Some enterprises with legacy systems seem resigned to moving slowly. Even today, I often hear comments like, “It’s just the nature of our size and heritage.”

We must debunk that myth. Speed is not preordained by heritage. Speed is a choice that any organization can make if it is prepared to harness the cloud. As a recent McKinsey article put it: “For CEOs, cloud adoption is not just an engine for revenue growth and efficiency. The cloud’s speed, scale, innovation, and productivity benefits are essential to the pursuit of broader digital business opportunities, now and well into the future.”

Culture Change

Many organizations can look for ways to change their culture and embrace speed, creating an environment that values urgency. In a culture designed for speed, people are actively encouraged to experiment and are rewarded for it. However, flipping a switch won’t suddenly deliver speed – companies have to build muscle as they learn how to innovate at pace, all the time.

Amazon has been around for nearly 27 years, and to this day we maintain what we call a “Day 1” culture – approaching everything we do with the entrepreneurial spirit of being on the first day of your organization. We do this by giving our teams autonomy, on the understanding that they operate within the guardrails of our culture.

group of people sitting indoors
Photo by fauxels on Pexels.com

We believe the more we can equip people to make high judgment decisions at all levels, the better off we, and our customers, are. We encourage employees to make high-velocity, high-quality decisions by setting the vision and context for teams. Since Amazon was founded in 1994, we’ve consistently operated based on three big ideas that every employee knows. The first is to obsess over customers. This is cemented in our mission statement to be “earth’s most customer-centric company.” The second is that if we focus on the customer it will force us to innovate – to look at new ways of solving problems on behalf of our customers. The third is to be stubborn in sustaining our long-term vision while being flexible in how we get there.

As Jeff Bezos explains, “In a traditional corporate hierarchy, a junior executive comes up with a new idea that they want to try. They have to convince their boss, their boss’s boss, their boss’s boss’s boss and so on – any ‘no’ in that chain can kill the whole idea.” Systems and processes that identify, validate, and approve new ideas from within the business are invaluable in democratizing company-wide idea exploration and driving experimentation in business as usual. For example, at Amazon, we make it easy for those closest to our customers to raise ideas for speedy review. Imagine a time-wasting process or one that results in a poor customer experience. People complain about it regularly, but they know that it can be so hard to implement change that it’s not worth the effort. The problem is put in the “too hard” basket and no one says anything. Now, imagine actually rewarding teams for suggesting a fix. Imagine if the process was fast and painless and resulted in change. How many great ideas would happen every week?

Thinking Big and Acting Small

Thinking big is the hallmark of innovation. But, as we look to move quickly and embrace greater experimentation, we should also look to de-risk the process. This means recognizing that the most powerful innovations often come through simplification. One small, seemingly insignificant cost or time-saving can drive enormous benefits for both companies and their customers when applied at scale. Thinking big also means starting big ideas with very small, reversible experiments. At Amazon, we look for “two-way doors.” If an experiment fails (as they often do), we can back out of the decision rather than being committed to moving ahead through a “one-way door,” which can be expensive and difficult to undo. This way, you learn quickly with very low stakes.

Photo by Burst on StockSnap

A great example of innovative thinking in the face of legacy technology is FashionValet. As the modest fashion brand grew, its multi-environment hybrid technology infrastructure was unable to keep up with demand during product launches. In 2019, FashionValet went all-in on AWS to optimize processes and meet growing demand. With Auto Scaling Groups and RDS Aurora features, FashionValet can now run 10x more servers during product launches to meet demand, then scale down automatically with no downtime. Using this technology, FashionValet has also accelerated their product development timeline by 200% and reduced their infrastructure management costs by 75%.
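
This is not FashionValet's actual configuration, but as a sketch of the kind of scheduled scale-out and scale-in an EC2 Auto Scaling group supports, a pair of boto3 calls might look like the following; the group name, action names, times and capacities are placeholders.

    import boto3
    from datetime import datetime, timezone

    autoscaling = boto3.client("autoscaling")

    # Scale out ahead of a product launch ...
    autoscaling.put_scheduled_update_group_action(
        AutoScalingGroupName="web-asg",
        ScheduledActionName="launch-day-scale-out",
        StartTime=datetime(2022, 3, 1, 7, 0, tzinfo=timezone.utc),
        MinSize=10, MaxSize=40, DesiredCapacity=30,
    )

    # ... and return to normal capacity afterwards, with no downtime for shoppers.
    autoscaling.put_scheduled_update_group_action(
        AutoScalingGroupName="web-asg",
        ScheduledActionName="post-launch-scale-in",
        StartTime=datetime(2022, 3, 2, 1, 0, tzinfo=timezone.utc),
        MinSize=2, MaxSize=10, DesiredCapacity=3,
    )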

Companies don’t have to bet their business on innovation, but they shouldn’t let legacy thinking hold them back. By actively empowering teams, clearing the path to “Yes,” and using small experiments, companies can build capability to promote high-velocity decisions – helping them operate at the speed of F1.

5G, Industry, & Collaboration at the Edge

Edge computing gives life to the transformative use cases that businesses are dreaming up today and brings real-time decision making to last-mile locales. That could be a far-flung factory or a train roaring down the tracks, someone’s connected home, a car speeding down the highway, or even space. Who thought we’d be running Kubernetes in space?

This shows that edge computing can transform the way we live, and we are doing it right now.

Why Collaboration Is Critical

Edge technologies are blending the digital and physical worlds in a new way, and that combination is resonating at a human level. This human resonance might sound like an aspirational achievement, but it is already here. A great example is when we used AR/VR to improve safety on the factory floor.

Continued collaboration, however, is necessary to keep enabling breakthrough successes. Across industries and organizations, we are all highly dependent on one another. Thinking about the telecommunications and industrial sectors, in particular, there is a mutually supportive, symbiotic relationship between these industries—5G development cannot be successful without industrial use cases, which, in turn, are based on telco technologies.

person writing on the notebook
Photo by Startup Stock Photos on Pexels.com

However, numerous challenges remain: reducing network complexity, maintaining security, improving agility, and ensuring a vibrant ecosystem. The only way to address and solve these challenges is by tapping into the collective wisdom of the community.

With open-source, we can unify and empower communities on a broad scale. The open-source ecosystem brings people together to focus on a common problem to solve with software. That shared purpose can turn isolated efforts into collective ones so that changes are industry-wide and reflect a wide range of needs and values.

The collaboration that open source makes possible continues to ignite tremendous change and alter our future in so many ways, making it the innovation engine for industries.

If we collaborate on 5G and edge in this manner, nascent technologies could become exciting common foundations in the same way that Linux and Kubernetes have. When we work together, the only limit to these possibilities is our imagination.

From Maps to Apps and Much More

Do you remember having to use a paper-based map to figure out driving directions?  Flash forward to today: Look at the applications we take for granted on our phones or in our homes that allow us to change our driving route in real-time to avoid traffic, or to monitor and grant access to our front doors—to the point that these have shaped how we interact with our environments and each other. Yet not too long ago, many of these things were unimaginable. We barely had cloud technology, we were in the transition from 3G to 4G, and smartphones were new.

holding a smartphone
Photo by cottonbro on Pexels.com

But there was important work being done by lots of people who were improving upon the core technologies. The convergence of three technology trends, as it turns out, unlocked a hugely disruptive opportunity: a cloud-native, mobile-device-enabled transportation service that picked you up wherever you were and took you wherever you wanted to go.

This opportunity was only possible because each trend built on the others to create a truly novel offering. Without any one of these trends, the ride-sharing apps of the world would not have been the same or as disruptive. Imagine yourself scrambling to find a WiFi hotspot on the street corner, whipping out your laptop outside a restaurant while standing in the rain, or starting your business by first constructing a massive data centre. The convergence of smartphones, 4G networks, and cloud computing has enabled a new world.

Today we are creating the next set of technologies that will become the things so embedded in our lives and so indispensable to our daily habits that we will wonder how we ever got by without them. Are you ready to be wearing clothes with sensors in them that tell you how healthy you are?

The possibilities with edge technologies are equally as exciting. It starts with the marriage of the digital world with the physical world. Adding in pervasive connectivity—leveraging a common 5G and edge platform—we can transform how operational technologies interact with the physical world and that changes everything.

The Future Is Now

We are creating this new world that is hard to imagine, yet it is not so foreign because we have seen how this story has played out before. Expect these new technologies to have profound implications for humanity—in our daily lives, how we interact with one another, and the social fabric of our world.

high angle photo of robot
Photo by Alex Knight on Pexels.com

All of that cannot happen without collaboration.

We have only to look at how open source has empowered collaboration and how working together has helped people across organizations and industries build more robust, shared platforms more quickly and differentiate on top of them—with apps and capabilities built on the foundation of Kubernetes and Linux, for example.

Vigilance is Crucial for Businesses in Dealing with Modern Malware

In just the first four months of 2021, Trend Micro’s research team detected 113,010 ransomware threats in Malaysia. Since the first detected case of ransomware infection globally in 2005[1], ransomware has evolved steadily, resulting in the emergence of what is often termed modern ransomware, which is even more targeted and malicious in nature.

The recent attack on enterprise technology firm Kaseya[2], where hackers demanded US$70 million (RM290.92 million) worth of bitcoin in return for stolen data, is a stark reminder of the sweeping damage and disruption that modern ransomware is capable of. 

crop hacker typing on laptop with information on screen
Photo by Sora Shimazaki on Pexels.com

Traditionally, ransomware attacks were conducted through “click-on-the-link” lures in spam emails or links that led to compromised websites. These campaigns were typically aimed at a random list of victims to collect moderate pay-outs.

Today, threat actors have evolved their strategies to inflict greater damage on a company’s reputation and potentially collect larger pay-outs from high-profile victims. This is what is becoming known as a “double-extortion” strategy in modern ransomware attacks. According to Trend Micro’s research[3], criminals take these steps to personalize the attacks:

  1. Organize alternative access to a victim’s network, such as through a supply chain attack;
  2. Determine the most valuable assets and processes that could potentially yield the highest possible ransom amount for each victim;
  3. Take control of valuable assets, recovery procedures, and backups;
  4. Steal and threaten to expose confidential data.

In Malaysia, Trend Micro found that the industries most targeted by ransomware are government, healthcare, and manufacturing[4]. As these sectors continue to play a role in driving economic growth in the country, it is clear that a multi-layered cybersecurity defence is necessary. These organizations will need to build such a defence to protect their networks and business-critical data.

close up view of system hacking
Photo by Tima Miroshnichenko on Pexels.com

To keep up with the ever-evolving ransomware landscape, the three most important must-dos for Malaysian organizations are:

  • Maintain IT hygiene factors: Security teams should ensure that proactive countermeasures, such as monitoring features, backups, and security skills training, are in place to enable early detection. Alongside that, everyone in an organization should also have the latest security updates and patches installed.
  • Work with the right security partners: Start by clearly defining the organization’s needs and priorities around enterprise security. Then, collaborate with a security vendor that aligns with these priorities to create a solid security response playbook to be used on an ongoing basis.
  • Have visibility over all security layers: For security teams to detect suspicious activity early on and respond to cyber attacks more quickly, organizations should utilize tools such as Trend Micro Vision One, which collects and automatically correlates data across email, endpoints, servers, cloud workloads, and networks (a minimal sketch of that correlation idea follows this list). By putting the right technologies in place, enterprises can also help reduce the alert fatigue commonly faced by security operations centers (SOCs), with 54% reporting that they are overwhelmed by alerts[5].
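
The sketch below is a generic illustration of that cross-layer correlation idea, not Vision One's actual engine; the alert fields are assumptions chosen for the example.

    # Fold alerts from different layers that share an indicator into one incident,
    # so analysts see one story instead of many isolated alerts.
    from collections import defaultdict

    def correlate(alerts):
        incidents = defaultdict(list)
        for alert in alerts:                  # alerts from email, endpoint, network, ...
            key = alert.get("indicator")      # e.g. a file hash, domain, or sender
            incidents[key].append(alert)
        return [group for group in incidents.values() if len(group) > 1]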

In today’s world of constant attacks, cybersecurity should be a top priority for everyone across the entire organization, not just the sole responsibility of the security team. While an organization can eventually recover its data or financial resources post-attack, the loss of trust among customers and partners will be a difficult challenge to remedy. All stakeholders must collaborate, invest in proper resources, and take proactive steps to transform workplace culture and best practices in order to stop pernicious ransomware threats at the door.


[1] Trend Micro, Ransomware, https://www.trendmicro.com/vinfo/us/security/definition/ransomware

[2] Trend Micro, IT Management Platform Kaseya Hit With Sodinokibi/REvil Ransomware Attack, 4 July 2021. https://www.trendmicro.com/en_my/research/21/g/it-management-platform-kaseya-hit-with-sodinokibi-revil-ransomwa.html

[3] Trend Micro, Modern Ransomware’s Double Extortion Tactics, 8 June 2021. https://www.trendmicro.com/vinfo/gb/security/news/cybercrime-and-digital-threats/modern-ransomwares-double-extortion-tactics-and-how-to-protect-enterprises-against-them

[4] Trend Micro, Trend Micro 2020 Annual Cybersecurity Report, 23 February 2021. https://www.trendmicro.com/vinfo/us/security/research-and-analysis/threat-reports/roundup/a-constant-state-of-flux-trend-micro-2020-annual-cybersecurity-report

[5] Trend Micro, 70% Of SOC Teams Emotionally Overwhelmed By Security Alert Volume, 25 May 2021, https://newsroom.trendmicro.com/2021-05-25-70-Of-SOC-Teams-Emotionally-Overwhelmed-By-Security-Alert-Volume

Resilience in the wake of 2020: Red Hat’s path in 2021

2020’s gone and it won’t be missed. For all of the chaos, confusion and change the previous year brought, it helped illuminate a critical facet of Red Hat, our associates, our partners, our customers and our communities. It showed that we are resilient. Not only did we weather it as a company, we helped those around us stand firm through the storm. That’s something to be proud of, and I know that as CEO of Red Hat, I’m thankful for how we as a business, as a pillar of the open source community and as a global organization kept a steady hand throughout.

Red Hat was born out of community. It’s at the center of everything we do. When faced with uncertainty and when we see others in need, that’s when we pull together and show our mettle. Throughout the past year, Red Hatters showed a tremendous capacity for fortitude and humanity. When I first took over the role of CEO, I made the comment that I wanted every Red Hatter who was here at that point to still be here in a year. And I think we’ve held true to that. 

view of cityscape
Photo by Aleksandar Pasaric on Pexels.com

At the time, that conversation centered on finding work-life balance when the lines became blurred. Without taking care of our personal lives and mental health, we’re not able to meet the needs of our customers. As associates became school teachers and caretakers, dealt with drastically reduced social interactions and grieved the loss of normalcy, they still served customers and helped them be successful. We didn’t just hunker down and wait for the storm to pass; we still moved forward and made ourselves available to help others.

No time to slow down

While the COVID-19 pandemic stalled many industries, the software industry raced forward. Technologies like cloud computing and automation became more important than ever. They are now firmly in the category of must-have, instead of nice-to-have. As a company, we turned our attention to products and services that our customers need to support remote work, expand digital services, scale to meet demand, become more resilient and keep innovating. I attribute our ability to continue to show strong growth throughout the year to this strategy and I’m so proud of the team for keeping the momentum going. 

With our biggest announcements last year, you’ll no doubt sense a theme – making sure that our customers can develop and deploy any app, anywhere. They want the choice and flexibility to use the innovations and technologies on a platform that makes sense for the job at hand, and we’re making sure they can do just that. Red Hat OpenShift is the industry’s leading enterprise Kubernetes platform and highlights a future where containers and virtualization, managed consistently across the open hybrid cloud, are helping customers maintain operations while still bringing new products and services to market faster. 

time lapse photography of city road at nighttime
Photo by zhang kaiyv on Pexels.com

We introduced Red Hat Advanced Cluster Management for Kubernetes, a new management solution designed to help organizations exert more consistent control over their Kubernetes clusters across the hybrid cloud — from bare-metal to major public cloud providers and everything in between. 

Once they can deploy anywhere, customers need to be able to bring those mixed workloads together, and that’s where OpenShift Virtualization comes in. As an integrated component of Red Hat OpenShift, it gives customers the ability to manage traditional workloads alongside cloud-native services, letting them prepare for the future while retaining existing investments. This helps to break down technology silos that can slow innovation and impact the customer experience.

For those wanting an increased level of support from us, OpenShift Dedicated is a fully managed service of Red Hat OpenShift on AWS, Google Cloud Platform and Microsoft Azure. We continue to enhance and refine the capabilities of this managed offering, providing an option for organizations looking to reduce the operational complexity of infrastructure management, but still get all the benefits of enterprise Kubernetes. This enables their IT teams to focus on building and scaling the next-generation of applications, rather than keeping infrastructure lit up.

One of the benefits of open source is our close connection to the innovation born in open source communities, where new ideas and concepts emerge and incubate. This is a direct link to IT’s future, enabling us to more readily see trends as they evolve. It’s this connection that enabled us to push the envelope in open hybrid cloud computing, and it’s now providing our launchpad for the next wave: edge computing. Edge brings its own challenges for administrators and developers alike, so we’ve delivered new capabilities for Red Hat Enterprise Linux and Red Hat OpenShift to help bring edge computing into hybrid cloud deployments. 

Coming together

The channel is what made Red Hat. Without our partner ecosystem, Red Hat would be a very different company. We have been successful because of our independence and our work across a broad spectrum of cloud and service providers, including Amazon, Google, IBM and Microsoft. As the saying goes: “actions speak louder than words.” Our neutrality is something that can’t change and you can see it in some of the moves we made this year. 

Red Hat and Microsoft have been working to co-develop hybrid cloud solutions for years, which ultimately led to Azure Red Hat OpenShift, the industry’s first jointly-engineered, managed and supported OpenShift service on a leading public cloud. This year we continued our drive as a leading enterprise Kubernetes service on the public cloud with Azure Red Hat OpenShift on OpenShift 4, bringing the power of Kubernetes Operators to Azure along with the flexibility of Red Hat Enterprise Linux CoreOS.

photo of people near wooden table
Photo by fauxels on Pexels.com

As I’ve said, open source is about choice and about meeting customers where they are, on whichever cloud platform they prefer. With that in mind, we continued our work across the public cloud with Red Hat OpenShift Service on AWS, a jointly-managed and jointly-supported enterprise Kubernetes service on AWS. Red Hat OpenShift is now the common Kubernetes denominator on two of the world’s largest clouds but, most importantly, it’s now easier for our customers to consume OpenShift where it makes most sense for them without sacrificing operational flexibility or service levels.

We’re also seeing the promise of our acquisition by IBM come to fruition as we scale and work together on powerful, world-spanning solutions. Schlumberger represents one of these moments. Through this collaboration with IBM, the initiative will support Schlumberger’s business and give its associates global access to its leading cloud-based exploration and production environment and cognitive applications, using IBM’s hybrid cloud technology built on Red Hat OpenShift.

On the horizon

Just a month in and we’ve already set the tone for the year. All roads, whether it’s through edge computing, serverless or Kubernetes, lead to open hybrid cloud. That’s what we’ve worked to build and where our focus continues to be. We’ve been talking about it for nearly a decade because it’s not just another trend; it’s an enterprise imperative. It’s through the hybrid cloud that we help our customers solve dynamic challenges and keep Red Hat in innovation’s vanguard.

We announced our intent to acquire StackRox, a leader and innovator in Kubernetes-native security. Once the transaction closes, this move will allow us to enhance security for cloud-native workloads by expanding and refining the Kubernetes-native controls already present in OpenShift, while shifting security into the container build and CI/CD phases.

person sitting on rock on body of water
Photo by Keegan Houser on Pexels.com

Having a seamless integration between our sales and services strategy and our technology vision is critical to our success, and it calls for the right leader. For nearly a decade, Arun Oberoi has led the team and transformed our go-to-market approach to match our expanding open hybrid cloud portfolio, through strategic acquisitions and new alliances. He will retire later this year, and Larry Stack will step into the role of executive vice president of Global Sales and Services. What I appreciate most about Larry is that he embraces the Red Hat culture and always keeps the customer in focus. There is a huge opportunity in front of us as we keep scaling, and Larry’s strong experience and strategic thinking are going to help us capitalize on it.

Just because we made it out of 2020 doesn’t mean we’re back to business as usual. The pandemic is still impacting the world and organizations are still feeling the effects. The challenges aren’t going away, but we’ve shown resilience, and that needs to be a trait that we keep as we move through the year. While 2021 holds many unknowns, one thing that is not unknown is our path forward.

AWS Committed to Accelerating Malaysia’s Digital Transformation

Amazon Web Services (AWS) has been a part of Malaysia’s digital transformation journey since 2015 when we established our presence in the country with a local marketing entity, AWS Malaysia Sdn. Bhd. The announcement by the Malaysian Government outlining the Malaysia Digital Economy Blueprint via MyDigital marks a milestone in the nation’s journey to transform to a digital economy, built on cloud computing. As part of this journey, AWS is delighted to be named as a Cloud Service Provider for the Government of Malaysia by the Malaysian Administrative Modernisation and Management Planning Unit (MAMPU).

With the right technology, governments, nonprofits, economic development organisations, and other entities can improve their internal operations, become more productive and, ultimately, focus more acutely on serving citizens. This can support business growth and help citizens enjoy improved quality of life. As organisations increasingly embrace cloud-based solutions, long-lasting effects can be realised in the form of community-wide collaboration, partnerships with local businesses, and increased innovation. These organisations can in turn wield greater influence on economic development and growth. The cloud also provides affordable IT services for entrepreneurs, helping them start and scale companies quicker and more reliably. These efforts pave the way toward building new businesses and a more productive workforce, which boosts local economic development.

Photo by Su San Lee on Unsplash

AWS works closely with state governments, education institutions, and not-for-profit organisations in Malaysia to accelerate innovation, increase agility, and drive cost savings through the cloud. Our Malaysian customers range from public sector entities such as Smart Selangor Delivery Unit (SSDU), Asia Pacific University, and other government agencies, to enterprises including Petronas, Maxis, Astro, and Boost (Axiata), to startups like StoreHub, FashionValet, and 123RF.

SSDU leverages cloud computing to transform government services

SSDU started working with AWS in 2018, when they first built a Citizens Electronic Payments Platform for Malaysians to access paid government services through a highly scalable and reliable central mobile and web portal on AWS. Using the centralised platform, SSDU was able to analyse near real-time data generated across the entire Selangor state to identify trends, forecast and optimise services, and accelerate the development of new solutions without large upfront investments. At the beginning of the COVID-19 pandemic, SSDU rolled out an operations dashboard on AWS that provided critical data for making real-time decisions to enforce containment measures. Simultaneously, SSDU facilitated the shift of over 1,000 local traders from physical to online stores to enable business continuity.

Using AWS, SSDU gained the control and confidence they needed to securely run their platform with the most flexible and secure cloud computing environment available today. As an AWS customer, SSDU benefits from AWS data centres and a network architected to protect all customer information, identities, applications, and devices. With AWS, SSDU has improved its ability to meet core security and compliance requirements, such as data locality, protection, and confidentiality using AWS’s comprehensive services and features. We look forward to partnering with more government agencies, empowering them to transform their digital service offerings for citizens.

Optimising the educational experience with cloud

Education institutions, like Asia Pacific University in Malaysia, have gone all-in on AWS, moving their entire technology infrastructure to AWS in order to transform the teaching and learning experience. They are running a mobile application for a cashless campus, deploying IoT services for their student attendance and queue systems, and using artificial intelligence and machine learning (AI/ML) for part of its learning environment. Asia Pacific University is delivering education resources to students 116 times faster than when they were using on-premises infrastructure, vastly improving students’ user experience. 

Upskilling the next generation of cloud talent

Additionally, the AWS Educate programme provides students and educators with resources and content focused on building cloud skills at education institutions in Malaysia such as Universiti Malaya (UM), Universiti Kebangsaan Malaysia (UKM), Universiti Putra Malaysia (UPM), Universiti Sains Malaysia (USM), Universiti Teknologi MARA (UiTM), and Asia Pacific University of Technology & Innovation. In August-September 2020, AWS held the ASEAN DeepRacer Women’s League to encourage young women in higher education institutions to acquire skills in artificial intelligence (AI) and machine learning (ML), and to discover how technology can be used to create innovations that solve real-world problems. Two Malaysian students finished as first and second runners-up in the ASEAN League.

young lady using laptop at table in modern workspace
Photo by Vlada Karpovich on Pexels.com

AWS has also been nurturing local startups in Malaysia and providing cloud skills training to develop the future workforce. Through AWS startup programmes, we have helped reduce the cost of experimentation and accelerate innovation with AWS promotional credits. For example, the AWS Trusted Advisor online tool helps startup customers reduce costs, increase performance, and improve security by inspecting their use of AWS services and suggesting optimisations. We also offer a variety of free online education and livestreamed or on-demand training events for startups across Asia at every growth stage, including AWSome Days and the AWS Builders Online Series for early-stage founders new to the cloud. To date, AWS has supported and helped grow hundreds of startups headquartered in Malaysia.
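For readers who want to go beyond the console, the same Trusted Advisor checks can also be retrieved programmatically through the AWS Support API. The sketch below is a minimal illustration using boto3; it assumes an account with a support plan that includes Trusted Advisor API access, and the cost-optimisation category shown is just one example.

```python
# Minimal sketch: listing Trusted Advisor cost-optimisation findings via the
# AWS Support API with boto3. Assumes a support plan that includes API access
# to Trusted Advisor; the Support API is served from the us-east-1 endpoint.
import boto3

support = boto3.client("support", region_name="us-east-1")

checks = support.describe_trusted_advisor_checks(language="en")["checks"]
for check in checks:
    if check["category"] == "cost_optimizing":
        result = support.describe_trusted_advisor_check_result(
            checkId=check["id"], language="en"
        )["result"]
        print(f"{check['name']}: {result['status']}")
```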

AWS is deeply committed to providing Malaysia with the best-in-class cloud technology. We look forward to building upon our worldwide experience in working with over 7,500 government agencies, more than 14,000 academic institutions and over 35,000 nonprofit organisations, alongside millions of active customers across other vertical industries around the world, to support the Government of Malaysia on its digital transformation journey.

A Deep Dive into One UI’s Design

There isn’t much you can’t do with a smartphone these days, from filming content to banking online from anywhere with just a few taps. But there is always room for improvement, and Samsung Electronics is constantly seeking to make the smartphone experience more intuitive to help us do even more. This is why One UI was created.

Unveiled in November 2018, One UI improved smartphone usability for millions of users. After two years of further evolution, Samsung launched One UI 3 in December 2020, building on the design, efficiency, and user experience of the original across various devices. Now Samsung is raising the bar yet again with One UI 3.1. Starting February 18th, the latest One UI brings updates that deliver powerful new functionality to select existing smartphones.[1]

So what kind of experience can users enjoy with One UI 3? Samsung Mobile Press sat down with the designers of One UI to ask them what we can expect.

The 4 Principles of One UI

1) Focus on the task at hand
2) Interact naturally
3) Be comfortable to view
4) Make things responsive

These four principles were established by One UI designers to give users the best experience possible.

▲ Jeonggun Choi, Principal UX Designer in the Core UX Group, Mobile Communications Business, Samsung Electronics

The fourth principle is a completely new one, introduced with One UI 3.1.

“From tablets to foldable phones and regular smartphones, the types of devices people are using have diversified, and the number of features and functions has also increased,” said Principal UX Designer Jeonggun Choi. “Following this trend, new principles were needed to provide the best layout for our users.” Whether an app is running on the Galaxy S21, Galaxy Tab, or the foldable Galaxy Z series, the UI is optimized for each device.

When using the Samsung Notes app on smartphones with regular sized displays, for example, users can access the app menu by pressing the navigation button at the top left side of the screen. But on the Galaxy Z Fold and Tab series, users can take advantage of the larger display by having the entire menu always in view, without having to press anything.

▲ Before (left) and after (right) views of the high contrast menu function

As part of the fourth principle, Samsung has also improved the accessibility experience by recommending features that complement the ones already in use. When someone with impaired or weakened vision has High contrast fonts turned on, for example, One UI suggests other features that improve visibility such as Bold font or Dark mode on the Recommended for you screen. One UI also reduces the hassle of having to sort through several menus by allowing users to turn off any accessibility features they are using from a single screen.

“Accessibility features are so diverse that it can often be difficult to use them to their full potential,” said Jeonggun Choi. “The ‘Recommended for you’ function increases convenience and helps users get more out of those features by identifying and recommending ones that users may need. We’ve also made it easy to use these features only when needed by allowing users to turn the features on and off while in use on one page.”

Galaxy Ecosystem Allows for Seamless Connectivity Between Devices

Another perk offered by One UI 3 is the ability to seamlessly switch between smartphones, tablets, and laptops. Taking into account the increased use of tablets and laptops spurred by a boom in remote learning and work, the new update provides a ‘connected device experience.’ This seamless ecosystem is paramount to enhancing study and work productivity.

“A new feature called ‘Continue apps on other devices’[2] has been added so that users can continue whatever they were doing on their smartphone – whether browsing a web page or working on a draft in Samsung Notes – on their tablet,” said Principal UX Designer Min-Young Chang. “Users can also copy text on their smartphone and paste it onto their tablet.”

In addition, users can connect their Book Cover Keyboard to both their tablet and smartphone with the Wireless keyboard sharing feature. The new Auto Switch feature also automatically connects Galaxy Buds to whichever device is playing media, so that users can seamlessly switch between their smartphones and tablets.

Connected device experiences are not limited to mobile devices; they are also available on home appliances such as TVs. Starting with One UI 3, users can use Smart View to enjoy multimedia content from their smartphone on their TV alongside the phone’s camera feed. This is especially beneficial for users who work out at home, allowing them to compare their movements with those of their virtual instructor.

“Starting with the One UI 3.1 update, users can cast their Google Duo video calls onto their TV with just one click,” said Jeonggun Choi. “With an increasing number of people connecting with their family via video calls and conducting virtual work meetings, this is an especially useful feature.”

A Customizable Galaxy Experience to Suit Your Needs

Smartphones are no longer just tools for getting various tasks done; they have become a means of self-expression. Perhaps the best example is decorating the Galaxy Z Flip with stickers to create your own unique phone, but there are many other ways to create a custom smartphone experience with the latest updates. “We have implemented a diverse array of features to let users use their smartphone as a form of self-expression,” said Jeonggun Choi. “Users can enjoy a customized Galaxy experience by choosing a video as the incoming and outgoing call screen or changing the wallpaper in the Messages app.”

▲ Before (left) and after (right) views of the quick panel. The design has been simplified, and the most commonly used icons have been placed up front so they are faster to find. Users can add other icons by clicking the ‘See more’ button in the top right corner.

“We first analyzed the usage for each feature available in previous models,” said Min-Young Chang. “After ranking the features based on their popularity, we placed the most used features at the top while hiding the least used features to simplify the panel.”

Saving Your Time, Even if it’s Only 1 Second

▲ Min-Young Chang, Principal UX Designer in the UX Design 2 Group, Mobile Communications Business, Samsung Electronics

Another updated feature available through One UI 3.1 is the integration of the Clock app with Digital Wellbeing’s Bedtime mode.[3] After opening the Clock app, users can tap See More, then tap Set bedtime to set their sleep and wake-up times. Users no longer have to switch between two apps, making setting a daily sleep schedule easier and faster.

“Users might not be aware of all of the One UI design updates that have been made, but combined, these improvements help users recognize and react to various features more quickly and directly. Even if it saves just one or two seconds of a user’s time, I think it will have been worth the effort,” said Jeonggun Choi. “The One UI designers are going to continue coming up with designs that elevate our users’ happiness and satisfaction with Galaxy devices.”


[1] Features of One UI 3.1 may vary by device model. Updates may vary by carrier, country, and model.
[2] Both connected devices need to be running One UI 3.1 or above, have Bluetooth turned on, be signed into the same Samsung account, and be connected to the same Wi-Fi network.
[3] The feature allows users to change the background to grayscale and mute incoming notifications in order to help them sleep better.

We’re in the Golden Age of Machine Learning, Tomorrow it will be Ubiquitous – Four Things We Need to Do Now

Today, thanks in large part to the cloud, actions such as communicating over text or transferring funds digitally are so commonplace that we hardly think about how incredible these processes are. As we enter the golden age of machine learning, we can expect a similar boom of capabilities that previously seemed impossible.

Machine learning is already helping companies make better and faster decisions. In healthcare, predictive models created with machine learning are accelerating research and the discovery of new drugs and treatment regimens. In other industries, it is helping remote villages in Southeast Africa gain access to financial services and matching individuals experiencing homelessness with housing.

While the short-term applications are encouraging, machine learning could have an even greater impact on our society over time. In the future, machine learning will be intertwined with, and working under the hood of, almost every application, business process, and end-user experience. However, before this technology becomes so ubiquitous that it’s almost boring, there are four key barriers to adoption we need to clear first.

Democratizing machine learning

The only way that machine learning will truly scale is if we as an industry make it easier for everyone – regardless of their skill level or resources – to be able to incorporate this sophisticated technology into applications and business processes.

green and white lights
Photo by cottonbro on Pexels.com

To achieve this, companies should take advantage of tools with intelligence built directly into applications, so their entire organization can benefit. For instance, 123RF, a homegrown stock photography portal, aims to make design smarter, faster, and easier for users. To do so, it relies on Amazon Athena, Amazon Kinesis, and AWS Lambda for data pipeline processing. Its newer product, Designs.ai Videomaker, uses Amazon Polly to create voice-overs in more than 10 languages. With AWS, 123RF has maintained the flexibility to scale its infrastructure, shortened its product development cycles, and is now looking to incorporate other services to support its machine learning and AI research.
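As an illustration of how approachable this kind of built-in intelligence can be, the sketch below generates a short voice-over with Amazon Polly using the boto3 SDK. The region, text, voice, and output file are illustrative assumptions, not 123RF’s actual Designs.ai Videomaker pipeline.

```python
# Minimal sketch: synthesizing a voice-over with Amazon Polly via boto3.
# The region, text, voice, and output path are illustrative assumptions.
import boto3

polly = boto3.client("polly", region_name="ap-southeast-1")

response = polly.synthesize_speech(
    Text="Welcome to your new design.",
    OutputFormat="mp3",
    VoiceId="Joanna",        # one of many voices; Polly covers dozens of languages
    LanguageCode="en-US",
)

# The synthesized audio is returned as a streaming body.
with open("voiceover.mp3", "wb") as audio_file:
    audio_file.write(response["AudioStream"].read())
```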

As processes go from being manual to automatic, workers are free to innovate and invent, and companies are empowered to be proactive instead of reactive. And as this technology becomes more intuitive and accessible, it can be applied to nearly every problem imaginable, from the toughest challenges in the IT department to the biggest environmental issues in the world.

Upskilling workers

According to the World Economic Forum, the growth of AI could create 58 million net new jobs in the next few years. However, research suggests that there are currently only 300,000 AI engineers worldwide, and AI-related job postings already run at roughly three times the volume of AI-related job searches, a divergence that is widening. Given this significant gap, organizations need to recognize that they simply aren’t going to be able to hire all the data scientists they need as they continue to implement machine learning in their work. Moreover, this pace of innovation will open doors and ultimately create jobs we can’t even begin to imagine today.

That’s why organizations in the region, such as Asia Pacific University, DBS, Halodoc, and others, are finding innovative ways to encourage and nurture young talent to pick up machine learning skills in fun, interactive, hands-on ways, such as the AWS DeepRacer League. It’s critical that organizations not only train their existing workforce in machine learning skills, but also invest in training programs that develop these skills in the workforce of tomorrow.

Instilling trust in products

With anything new, people are often of two minds: either an emerging technology is a panacea and global savior, or it is a destructive force with cataclysmic tendencies. The reality, more often than not, is something more nuanced in the middle. These disparate perspectives can be reconciled with information, transparency, and trust.

Photo by Arseny Togulev on Unsplash

As a first step, leaders in the industry need to help companies and communities learn what machine learning is, how it works, where it can be applied, how to use it responsibly, and what it is not.

Second, for people to gain faith in machine learning products, those products need to be built by diverse groups of people across gender, race, age, national origin, sexual orientation, disability, culture, and education. We will all benefit from individuals who bring varying backgrounds, ideas, and points of view to inventing new machine learning products.

Third, machine learning services should be rigorously tested, with accuracy measured against third-party benchmarks. Those benchmarks should be established by academia as well as governments and applied to any machine learning-based service, creating a rubric for reliable results and contextualizing those results for specific use cases.
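To make the idea concrete, here is a deliberately simple sketch of what benchmark-driven testing can look like: a model is scored on a held-out benchmark set and compared against a published accuracy threshold. The model, benchmark data, and 0.90 threshold are hypothetical placeholders, not an established industry rubric.

```python
# Illustrative sketch: scoring a model against a benchmark set and a published
# accuracy threshold. The model, benchmark data, and 0.90 threshold are
# hypothetical placeholders, not an established industry rubric.
from sklearn.metrics import accuracy_score

def meets_benchmark(model, benchmark_inputs, benchmark_labels, threshold=0.90):
    """Return True if the model's accuracy on the benchmark meets the threshold."""
    predictions = model.predict(benchmark_inputs)
    accuracy = accuracy_score(benchmark_labels, predictions)
    print(f"Benchmark accuracy: {accuracy:.3f} (threshold: {threshold})")
    return accuracy >= threshold
```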

Regulation of machine learning

Finally, as a society, we need to agree on what parameters should be put in place governing how and when machine learning can be used. With any new technology, there has to be a balance in protecting civil rights while also allowing for continued innovation and practical application of the technology.

small judge gavel placed on table near folders
Photo by Sora Shimazaki on Pexels.com

Any organization working with machine learning technology should engage customers, researchers, academics, and others to weigh the benefits of its machine learning technology against the potential risks. Organizations should also be in active conversation with policymakers, supporting legislation, and creating their own guidelines for the responsible use of machine learning technology. Transparency, open dialogue, and constant evaluation must always be prioritized to ensure that machine learning is applied appropriately and is continuously improved.

What’s next

Through machine learning we’ve already accomplished so much, and yet, it’s still day one (and we haven’t even had a cup of coffee yet!). If we’re using machine learning to help endangered orangutans, just imagine how it could be used to help save and preserve our oceans and marine life. If we’re using this technology to create digital snapshots of the planet’s forests in real time, imagine how it could be used to predict and prevent forest fires. If machine learning can be used to help connect smallholder farmers to the people and resources they need to achieve their economic potential, imagine how it could help end world hunger.

To achieve this reality, we as an industry have a lot of work ahead of us. I’m incredibly optimistic that machine learning will help us solve some of the world’s toughest challenges and create amazing end-user experiences we’ve never even dreamt of. Before we know it, machine learning will be as familiar as reaching for our phones.