Mercury and Intel are Making Mobile Platforms Smarter with Artificial Intelligence for Complex Missions
John Bratton
November 13, 2020
Mercury and Intel work closely together to develop technologies that let aerospace and defense processing systems be deployed wherever they are needed to complete more complex and increasingly autonomous missions, regardless of how harsh, contested or cramped their environments are. Our combined technologies take the best commercial data center capabilities and migrate them seamlessly across fog and edge layers. These decentralized processing resources, required for next-generation autonomous platforms such as urban air mobility (UAM) vehicles and unmanned aerial systems (UASs), and for smarter fog layers that supply greater environmental awareness and connectivity, are enabling big data and artificial intelligence (AI)-powered everything, everywhere. So, how do we do it?
AI Processing at the Edge
Big data and AI processing is increasingly being deployed in edge applications that need quick-reaction capability and untethered cognitive functionality far from data center-powered clouds. Nowhere is this more pronounced than in the rapidly emerging and well-funded autonomous platform (ground and air) domain.
As commercial and military, manned and unmanned, and fixed- and rotary-wing mobile platforms become smarter and more capable, they require greater on-board AI and big-data processing to handle the torrents of sensor and situational-awareness data that drive autonomous decision-making and effector control. (Effectors are the highly deterministic, reliable and safe vetronics, avionics and other safety- and mission-critical functions required for platform control.) As the number of smart platforms grows, so does the need for a greatly expanded, distributed fog layer with the big-data processing capability to manage the increased traffic safely and efficiently.
Common Enterprise Architecture from Cloud to Edge
Big-data processing is still largely confined to static data centers running high-performance servers powered by the latest Intel processors, which carry increasingly dedicated resources for accelerating AI workloads. Scaling this capability across fog and edge layers means making data center servers portable and resilient to harsh environments and tampering. To deploy this processing power, these servers must be miniaturized, environmentally protected, secure, refreshable and affordable. When all of these requirements are met, the composable data center and its AI-processing capability can migrate seamlessly across fog and edge layers, creating a common, scalable enterprise architecture. AI processing can then be placed wherever it is needed, including next to the sensor, on board the platform.
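To make the idea of a common architecture concrete, the sketch below shows one way the same inference-service definition could be deployed unchanged to a cloud, fog or edge node, with only its placement changing. This is a hypothetical Python illustration; the names (Tier, InferenceService, deploy) are ours and do not represent a Mercury or Intel API.

```python
# Hypothetical sketch: one service definition, three placement tiers.
from dataclasses import dataclass
from enum import Enum


class Tier(Enum):
    CLOUD = "data-center"
    FOG = "fog-node"
    EDGE = "on-platform"


@dataclass
class InferenceService:
    model: str          # e.g. a sensor-fusion or object-recognition model
    accelerator: str    # e.g. "cpu-int8" for Xeon-class INT8 inference


def deploy(service: InferenceService, tier: Tier) -> str:
    # The service definition is identical at every tier; only placement differs.
    return f"deploying {service.model} ({service.accelerator}) to {tier.value}"


if __name__ == "__main__":
    svc = InferenceService(model="target-recognition", accelerator="cpu-int8")
    for tier in Tier:
        print(deploy(svc, tier))
```

The point of the sketch is simply that nothing in the workload changes between tiers; the deployment decision is reduced to choosing where the same composable building block runs.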
Deploying the Intel-Powered Data Center
Intel’s Xeon Scalable processors, with their on-die AI accelerators, are the gold standard in big data and AI processing and power today’s most advanced data centers. To be deployed successfully at and near the edge, they require advanced packaging, miniaturization, cooling and unrestricted I/O pipes. Mercury has developed the technologies that efficiently address these deployment requirements (a quick way to confirm the on-die AI acceleration on deployed hardware is sketched after the list). Specifically:
- Rugged packaging – The most powerful AI processors are designed for benign data center environments. Each is connected to its mounting substrate via retaining clips, which also maintain the physical I/O connectivity. This approach is efficient to implement but physically vulnerable: the slightest disturbance can disrupt I/O connectivity, which is not an issue inside a data center. Mercury does not use a retaining clip; instead, we reflow (solder) the thousands of processor I/O connections to the substrate and then underfill the processor with epoxy, allowing the resulting package to withstand the harshest environments and conditions.
- Miniaturization – System-in-package, wafer-stacking, 2.5D and 3D fabrication techniques shrink the data center server from its 19-inch rack footprint into smaller form factors with varying degrees of size, weight and power (SWaP) performance. Form factors range from rugged, reduced-profile composable servers used for fog and near-edge processing to dense, defense-grade OpenVPX form-factor blades that reduce server volume by more than 90%.
- Cooling – Effective and efficient conduction, air, liquid and hybrid cooling technologies allow small-form-factor Xeon-powered packages to operate reliably at full throttle, delivering unrestricted processing performance across wide temperature ranges. Our liquid-cooled solutions can use platform fuel and refrigerants as coolants, and our air-cooled solutions use advanced air-management techniques, versus the traditional, less-efficient air-blowing approaches used in data centers.
- Unrestricted fabrics – Many open system architectures have performance bottlenecks, especially across the interconnects between their constituent modules. Our backplane channel and enhanced interconnect technologies enable unrestricted switch-fabric performance across the largest processing systems and temperature ranges, so the processing power and scalability of Intel’s latest processors are not limited.
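As the small, hedged example referenced above: the Linux-only Python sketch below (our assumption, not drawn from Mercury or Intel documentation) reads /proc/cpuinfo to confirm that a deployed Xeon Scalable part exposes the AVX-512 VNNI instructions behind Intel DL Boost before INT8 inference work is scheduled onto it.

```python
# Minimal capability check for on-die AI acceleration (AVX-512 VNNI).
# Assumes a Linux host; on other operating systems a different mechanism
# would be needed.
def has_dl_boost(cpuinfo_path: str = "/proc/cpuinfo") -> bool:
    """Return True if the CPU advertises the avx512_vnni feature flag."""
    try:
        with open(cpuinfo_path) as f:
            return any(
                "avx512_vnni" in line
                for line in f
                if line.startswith("flags")
            )
    except OSError:
        # Treat an unreadable cpuinfo as "capability unknown / unavailable".
        return False


if __name__ == "__main__":
    print("DL Boost (AVX-512 VNNI) available:", has_dl_boost())
```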
To avoid vendor lock-in and add velocity to technology adoption and refreshes, all of these Mercury technologies are open-systems compliant. Applied together, they enable Intel’s latest AI processing technologies to be deployed across a wide spectrum of environments and form factors, from rugged rackmount servers to extremely rugged OpenVPX server blades and custom small form factors.
Go Anywhere with the Right Security
Mercury has made the physical data center deployable in packages that match platform requirements, enabling those platforms to execute more sophisticated and increasingly autonomous missions in complex environments. For practical deployment, these remote processing capabilities must also be secure and trusted.
Embedded systems security engineering (SSE) protects processing systems from unauthorized technology transfer and alteration of functionality. Our embedded SSE builds in the layered, customizable protection framework that modern aerospace and defense processing applications require. These security frameworks span software, firmware and hardware and can be configured with trusted third-party IP, enabling private, personalized system-wide security.
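As a simple illustration of one software-layer protection in such a framework, the sketch below verifies that a deployed image matches its detached RSA signature before it is trusted. This is a generic example built on the Python cryptography library, not Mercury’s SSE implementation; the file paths and key handling are hypothetical.

```python
# Illustrative only: verify a software image against a detached RSA signature
# before allowing it to run. Paths and key management are placeholders.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding


def image_is_trusted(image_path: str, sig_path: str, pubkey_path: str) -> bool:
    """Return True only if the image matches its detached signature."""
    with open(pubkey_path, "rb") as f:
        public_key = serialization.load_pem_public_key(f.read())
    with open(image_path, "rb") as f:
        image = f.read()
    with open(sig_path, "rb") as f:
        signature = f.read()
    try:
        # Raises InvalidSignature if the image or signature has been altered.
        public_key.verify(signature, image, padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False
```

In a real system this kind of check would be only one layer among several, anchored in hardware and firmware roots of trust rather than in application code.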
The Next Great Age of Innovation
AI is revolutionizing everything. As data center AI processing capability migrates across infrastructural fog layers and is embedded into platforms, those platforms are becoming smarter and more aware. With their new smarts, they are becoming more autonomous and able to complete complex missions and tasks. In the age of AI-powered everything, a common data center architecture that transcends physical boundaries is ushering in the next great age of innovation, where the power of AI can be applied anywhere and everywhere. Learn more.