Manufacturing Goes Digital

A blueprint for the future rooted in lessons from the past

Nan Li

It’s hard to imagine an industry as harsh and unforgiving to technological deployment as manufacturing. Remote facilities. Spotty internet connectivity. Limited (if any) data streams, often trapped on legacy bus protocols. High thermal, vibration, and electrical stress. When envisioning million-square-foot spaces, one is hard-pressed to dream up opportunities for the classic data + software toolkit used by Silicon Valley. And yet, the digitization of manufacturing (hereafter “digital manufacturing”) is one of the most interesting frontiers for the deployment of technology today.

Why Now?
While it has received press and industry attention only recently, digital manufacturing has been gaining traction behind the scenes throughout the last decade. Invisible to and independent of the tech community, a confluence of forces has contributed to the major tailwinds driving the industry today.

Forces within industry include:

Data Availability: Industrial equipment OEMs (original equipment manufacturers) are building sensing and connectivity directly into the machinery itself. As new sensors and wireless connectivity are installed across individual machines and across subsystems in the assembly line, factories are becoming data rich.

Standardization/Interoperability: Industry standards are being set for the data that streams off various components in the manufacturing value chain.

Pressures on Efficiency: Global rises in labor costs have intensified the focus on recapturing margin through utilization improvements.

Global Consumption: Two billion people across the world are projected to join the consumer class in the next 10 years. This will put tremendous pressure on product companies to rethink their manufacturing and supply chain systems to service this demand.

Alongside industry movement, advances in technology driving these changes include:

Edge Intelligence &amp; Decisions: Cloud-derived algorithms and models are being pushed out to the factory environment, where computation runs locally with reduced bandwidth needs.

Computer Vision: Vision systems are now orders of magnitude better than they were in 2005.

Adaptable Robotics: Robots are shifting from following pre-written code dictating movement in 3/5/9 axes toward sensing and reacting to the environment in real time.
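The edge-intelligence pattern above can be sketched in a few lines: a hypothetical filter running on a factory gateway keeps a rolling statistical baseline of a sensor signal and forwards only anomalous readings upstream, trading raw-stream bandwidth for local computation. All names here (`EdgeAnomalyFilter`, the thresholds, the sample data) are illustrative, not from any real product.

```python
from collections import deque
import statistics

class EdgeAnomalyFilter:
    """Runs locally at the edge: keeps a rolling window of sensor
    readings and forwards only statistically unusual ones, so the
    raw data stream never has to leave the plant."""

    def __init__(self, window=50, z_threshold=3.0):
        self.window = deque(maxlen=window)   # rolling baseline
        self.z_threshold = z_threshold       # how unusual is "anomalous"

    def observe(self, value):
        """Return True if `value` should be sent upstream as an alert."""
        if len(self.window) >= 10:  # need some history before judging
            mean = statistics.fmean(self.window)
            stdev = statistics.pstdev(self.window)
            is_anomaly = stdev > 0 and abs(value - mean) / stdev > self.z_threshold
        else:
            is_anomaly = False
        self.window.append(value)
        return is_anomaly

# A steady vibration signal with one spike: only the spike is forwarded.
edge = EdgeAnomalyFilter()
readings = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95] * 5 + [9.0]
alerts = [r for r in readings if edge.observe(r)]
```

Instead of shipping every reading to the cloud, the gateway transmits only the flagged values (here, just the spike), which is the bandwidth-reduction trade the bullet describes.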

What Data Centers Can Teach Us About Opportunities in Manufacturing
We are at the beginning of a massive technology-driven transition within the manufacturing industry. Industry insiders have already started to dream up opportunities to solve big problems leveraging sensors, data, analytics, and automation.

To best formulate a blueprint for the future products and companies that will be started in this industry, it might be prudent to reflect on a category with historical lessons relevant to manufacturing today.

During the dot-com boom of the ’90s, independently owned and maintained data centers were amassing groups of heterogeneous machines that powered the early web. Clusters of servers assembled in large rooms were brought online overnight, fueled by consumer demand for the new web medium and venture capital dollars pushing for growth. These data centers were manned by armies of IT support staff on 24/7 duty to keep websites up and running.

These clusters carried with them a startling amount of fragility and interdependency, and when (not if) they went down, they really went down. During these multi-hour/day dark periods, IT teams would spring into action—transforming into medical examiners, tracing symptoms, and looking for a diagnosis using haphazard trial-and-error methods. Methods that often started with the most basic step in IT debugging: turn it off and turn it back on again.

Fast-forward to today, and outside of a shared vocabulary (clusters are still called clusters), there is little to tie the shoddy data centers of yesterday to the perfectly humming, virtually managed, self-repairing infrastructure of today. Today’s data centers are resilient, measuring downtime percentage by the basis point. They are even beautiful, with a minimalism, symmetry, and sense of infinity that mimic an Escher drawing.

It’s easy to look 20 years backwards in time and point fingers at the rudimentary state of technology. But this examination of the past isn’t meant to call out the gaps of yesterday; it is meant to highlight the speed and momentum that technical progress can bring given the right conditions. The fundamental changes in how data centers are operated over this period have led to the emergence of an entire multi-billion-dollar industry, infrastructure software, and numerous unicorn startups along the way. Companies that were built to address these fundamental gaps are now household names: Splunk, New Relic, VMware, Mesosphere, Docker, Tanium.

If history can once again repeat itself, then I believe the repetition will play out along the dimension of this analogy:

Factories = Data Centers

The factories of today are the data centers of yesterday. Factories are large rooms of heterogeneous, somewhat interdependent machinery with little data tracking or visibility into operations. The overall utilization efficiency of a factory ranges from 60% to 85%, depending on the industry and who you talk to. When a factory goes down (often for hours or days), an army of operations engineers springs into action and forms diagnostic SWAT teams looking for the root cause and solution. A frequent first step when the line is down: turn it off and turn it back on again.

Although this is an emerging space, parallels to the transformation of data centers present a blueprint for a number of opportunities:

Making Sense of Big Data: Just as Splunk built a way to parse through standard crash logs generated by servers, there is a large amount of data already streaming off machines that isn’t captured, triaged, or flagged the right way. There will be value in the steps of data collection, normalization, and visualization alone.

Intelligent Systems: In the same way that VMware built modularity and redundancy into servers, there is an opportunity to build ties between steps of the manufacturing process. This machine-to-machine communication can push factories to act as one entity, with feedback loops between stages rather than siloed steps of an assembly and QA line.

Worker Empowerment: Within data centers, infrastructure maintenance has shifted from low-skilled machine-operations work to high-skilled systems-strategy work operating powerful command-and-control software. Up-leveling factory operators with analytics suites will help factories realize greater utilization and efficiency.
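The normalization step behind the first opportunity above can be sketched concretely: different machine controllers emit the same physical measurement under different field names, formats, and units, and the first unit of value is simply mapping them onto one common schema. The record layouts and field names below are hypothetical, invented for illustration.

```python
from datetime import datetime, timezone

# Hypothetical raw records as they might stream off two different
# machine controllers, each with its own field names and units.
RAW_RECORDS = [
    {"src": "press_01", "ts": 1700000000, "temp_f": 212.0},
    {"machine": "lathe_07", "time": "2023-11-14T22:13:20+00:00", "temp_c": 95.5},
]

def normalize(record):
    """Map vendor-specific fields onto one common schema:
    (machine_id, timestamp, temperature_c)."""
    if "src" in record:  # controller A: epoch seconds, Fahrenheit
        return {
            "machine_id": record["src"],
            "timestamp": datetime.fromtimestamp(record["ts"], tz=timezone.utc),
            "temperature_c": (record["temp_f"] - 32) * 5 / 9,
        }
    # controller B: ISO-8601 timestamp strings, Celsius
    return {
        "machine_id": record["machine"],
        "timestamp": datetime.fromisoformat(record["time"]),
        "temperature_c": record["temp_c"],
    }

normalized = [normalize(r) for r in RAW_RECORDS]
```

Once every reading shares one schema, the triaging, flagging, and visualization layers can be built once rather than per vendor, which is the Splunk-style leverage the bullet describes.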

The dimension of impact, the “so what?”, is where manufacturing breaks out of the data center analogy into a league of its own. When a data center fails, we are greeted by a friendly cartoon whale. When a factory has an undiagnosed problem, faulty product is generated at a staggering rate. In the best case, these products are discovered and thrown away, an enormous waste of energy, water, raw materials, and human labor. In the worst case, faulty products aren’t caught in time and are shipped into our homes or out onto the roads.

Solutions in manufacturing can lead to significant gains in operating efficiency, environmental sustainability, and public safety—while justifying business ROI.

I am incredibly excited about the infrastructural changes that have developed to enable this new “application layer” of development in digital manufacturing. The key going forward will be to merge the ingenuity and technical capabilities of entrepreneurs with the experience and know-how of industry veterans.

Names such as Pittsburgh Steel or Big Auto may not elicit images of hoodied twenty-somethings sketching wireframes over a Chemex, but the intersection and collaboration of these two groups is where the magic will happen.
