
What We Learned About the Edge and Edge Computing from Innovation Day


Hughes and Plug N Play hosted a half-day event called Innovation Day that focused on edge computing and the various technological threads that it weaves together. As you’ve read in my previous posts, there are three key stakeholders when it comes to the evolution of the edge computing ecosystem—‍and an ecosystem it is. This event brought together all three of the stakeholders: enterprises who see the value of edge computing, application and infrastructure development companies who deliver the business intelligence, and orchestrators who ultimately manage and serve these brand-new operating environments at the edge.

I urge you to watch the video of the event, which we’ve modularized for ease of viewing, but I want to share some insights that we gained from this remarkable day in this post. The event had three distinct parts:

  • A keynote that set the intellectual stage for the day and brought everyone to a common understanding about the state of technologies and the market
  • An amazing panel of five experts and two market analysts that covered actual implementation use cases and challenges from both a business and technology standpoint
  • Pitches from nine cutting-edge startups that are leading the charge on technologies that have defined edge computing

Keynote speaker Matt Trifiro, Chief Marketing Officer at Vapor IO, stated that the “edge” is essentially the edge of the last-mile network. Per Matt, “… the last‑mile network is where the infrastructure of the Internet meets the physical world, and that typically takes the form of fiber to the home, fiber to a business facility or a factory, or a wireless signal to a phone, to a car; whether it's 4G, LTE, or 5G, and I'm sure in the future we'll invent other last mile network technologies…” During the event itself, this definition was further refined. The panelists, practitioners of AI and machine learning, brought to light the ways they were using edge computing at the device level itself. For example, Dan Davies from North American Lighting spoke about using edge computing within the headlights of a car!

Edge computing is fundamentally an enabler for edge intelligence, and the decision to deploy edge intelligence starts with defining the business case for it. The panelists discussed the different routes they took to bring intelligence to their edge deployments. For some, the gains came quickly; for others, it has been an incremental process. Several panelists described the work of making AI and machine learning part of mainstream strategic thinking, and the challenges that came with it. The point is that they persevered, and they now lead their industries in bringing these technological goals to the forefront.

Seeing the startups was fantastic. They are the reason it’s even possible to imagine hosting intelligence at the edge. They have optimized machine learning models to fit a wide range of edge hardware, and developed federated learning, selective intelligence, deep learning models, and databases custom-built for distributed edge ecosystems. Their range of capabilities, their understanding of the emerging environments, and their academic and practical expertise make them an essential part of any edge‑intelligent solution.
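None of the startups walked through their code on stage, but to make the idea of “optimizing a machine learning model to fit edge hardware” a little more concrete, here is a minimal, hypothetical sketch using post-training quantization in TensorFlow Lite. The framework choice, model path, and file names are assumptions for illustration only, not anything presented at Innovation Day.

```python
# Minimal sketch (assumption: a trained TensorFlow SavedModel exists at "saved_model/").
# Post-training quantization is one common way to shrink a model so it can run
# on memory- and power-constrained edge devices.
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("saved_model/")
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # enable default weight quantization
tflite_model = converter.convert()

# The compact .tflite file is the artifact that actually ships to the edge device.
with open("edge_model.tflite", "wb") as f:
    f.write(tflite_model)
```

The details vary by vendor and hardware target, but this kind of model compression is representative of the work the startups described in fitting intelligence onto edge devices.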

I look forward to your comments and collaborating with you all.