With the Orion spacecraft performing well so far, the capsule will soon enter its Distant Retrograde Orbit around the Moon as significant tests of various capsule and service module systems continue.
Inside Orion, in addition to the myriad experiments and technology demonstrations, teams from Lockheed Martin, Amazon, and Cisco have been busy working through the Callisto demonstration. Callisto — a combination of Amazon's Alexa and Cisco's Webex video conferencing software — aims to bring virtual assistants and reliable video conferencing to future human deep space flights.
In an interview with NASASpaceflight, Rob Chambers — Director of Strategy and Business Development for Commercial and Civil Space, Lockheed Martin — said, “From a human condition perspective, I’m on the other side of the Moon, I’m physically remote, I’m psychologically remote. My bandwidth is constrained. I have time delays that are only going to get longer as I go on to Mars.”
“How do we improve awareness and interactivity, make people more efficient, stop wasting brain cells on the easy stuff that computers can do?”
That’s what Callisto is designed to test… and much more.
The Alexa taking part in the Callisto demonstration is not able to interface with any critical spacecraft systems like life support or propulsion. “Alexa can’t abort the mission or fire an engine,” noted Chambers. “And rightfully so.”
What Alexa will be tested on during Artemis I falls into several categories: controlling lights inside the Orion capsule in response to voice commands, correctly accessing spacecraft data to answer queries with accurate information, and serving as part of a virtual presence device for the video conferencing/whiteboard portion of the demonstration.
Overall, teams built more than a thousand sample utterances to train Alexa on where to look within Orion for the information that local requests need.
“If you are on the spacecraft, Alexa needs to know to go to the Orion velocity and the telemetry stream,” noted Chambers.
A larger part of this is that Alexa needs to be able to understand that different phrases can be used to ask for the same information.
For example, “What’s my speed?” and “How fast am I going?” are two different ways of asking for the same data. But when an astronaut on Orion asks either of those questions, they’re actually asking “How fast is Orion going?” Alexa needs to be able to recognize the meaning behind the question and then know which of Orion’s telemetry streams to access to find the desired content.
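The idea can be sketched in a few lines of Python. This is a minimal illustration only, with hypothetical names and invented telemetry values, not Callisto's actual implementation, which relies on more than a thousand trained utterances rather than a lookup table: many phrasings resolve to one canonical intent, and that intent maps to a specific field in the local telemetry stream.

```python
# Hypothetical local telemetry snapshot (values are illustrative only)
TELEMETRY = {"velocity_kms": 1.05, "distance_from_earth_km": 432_210}

# Several different phrasings resolve to one canonical intent
UTTERANCE_TO_INTENT = {
    "what's my speed": "get_velocity",
    "how fast am i going": "get_velocity",
    "how fast is orion going": "get_velocity",
    "how far away is earth": "get_distance",
}

def handle(utterance: str) -> str:
    """Normalize the utterance, resolve its intent, and read local telemetry."""
    intent = UTTERANCE_TO_INTENT.get(utterance.strip().lower().rstrip("?"))
    if intent == "get_velocity":
        return f"Orion is traveling at {TELEMETRY['velocity_kms']} kilometers per second."
    if intent == "get_distance":
        return f"Orion is {TELEMETRY['distance_from_earth_km']:,} kilometers from Earth."
    return "Sorry, I don't know that one."
```

Either phrasing of the speed question returns the same answer, because both collapse to the same intent before any data is read.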
All of this is easier said than done, especially in a spacecraft — which is not the most acoustically friendly environment for a virtual assistant like Alexa.
That element is being tested daily: several different people in Mission Control speak with the Alexa in Orion during periods of Deep Space Network (DSN) connection, when mission bandwidth allows.
Part of this, too, will involve making sure Alexa can hear properly in the not-acoustically-perfect Orion capsule and can respond properly to different voice patterns and accents.
Another element of consideration is what happens if the information requested can only be gathered by connecting back to the internet via the DSN. And what if Orion is out of communications line-of-sight with the DSN at the time?
Another test area of Callisto includes what Alexa will say during periods of Orion-to-Earth communications dropouts, including providing the time when the blackout will end and the query can be answered.
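The dropout behavior can be sketched as follows. This is a hypothetical illustration of the pattern described above, not Callisto's code; the acquisition-of-signal time and function names are invented for the example.

```python
from datetime import datetime, timezone

# Assumed time when the next DSN contact window opens (illustrative only)
NEXT_AOS = datetime(2022, 11, 21, 14, 30, tzinfo=timezone.utc)

def cloud_reachable(now: datetime) -> bool:
    """True once the spacecraft is back in line of sight with the DSN."""
    return now >= NEXT_AOS

def answer(query: str, now: datetime) -> str:
    """Answer a cloud-backed query, or say when the blackout ends."""
    if cloud_reachable(now):
        return f"Looking up '{query}'..."
    wait_min = (NEXT_AOS - now).total_seconds() / 60
    return (f"I can't reach Earth right now. "
            f"Communications resume in about {wait_min:.0f} minutes.")
```

The key design choice is that a blocked request still produces a useful answer: the user learns when the query can be retried instead of getting silence.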
Alexa’s ability to interface with the internet from lunar orbit is also being put to the test, as is its lag time given the few seconds it takes signals to travel the distance between Earth and Orion.
“If we were on the other side of the Moon, it’s a couple of seconds roundtrip. That’s no big deal. But then you have the switching, you have all of the interfaces and handoffs in the systems,” noted Chambers.
“So if I asked a question, it has to then come back down to Earth to process, get the information from the cloud, send it back to Alexa, and then have Alexa articulate the results. It could be like a 10-second total round trip.”
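The light-time component of that lag is simple arithmetic; the rest is network overhead. The sketch below uses the average Earth-Moon distance, and the switching/processing figure is a guess chosen only to show how the total approaches the roughly 10 seconds Chambers describes.

```python
C_KM_S = 299_792.458     # speed of light, km/s
EARTH_MOON_KM = 384_400  # average Earth-Moon distance, km

one_way_s = EARTH_MOON_KM / C_KM_S   # ~1.28 s each way
light_round_trip_s = 2 * one_way_s   # ~2.56 s: the "couple of seconds"

# Assumed ground-network switching plus cloud processing time (hypothetical)
assumed_overhead_s = 7.0
total_s = light_round_trip_s + assumed_overhead_s

print(f"light round trip ~{light_round_trip_s:.2f} s, total ~{total_s:.1f} s")
```

In other words, physics accounts for only a few seconds; most of the lag a crewmember would perceive comes from the handoffs between systems on the ground.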
The human reaction to that lag also arms user interface designers with valuable information on what the virtual assistant should do while processing a request, such as playing music or beeping during the delay, so the person using it understands what it is doing.
But another critical element of the Callisto demonstration is video conferencing. To accomplish that within the available bandwidth, "cutting-edge image compression" was needed on Cisco's part for the Webex system.
“We’re trying to do, in some ways, modern video conferencing over dial-up type speeds. It’s not quite that slow, but [we’re] talking tens of kilobytes or hundreds of kilobytes,” said Chambers.
For this test, Alexa itself has to function as a virtual crewmember via inputs from Mission Control that are separate from the comparatively simple “talk to Alexa” tests.
This is where the off-the-shelf, stock iPad comes into play, with Mission Control able to test video feeds, image compression, choppiness, time delay, and accuracy of video-over-voice priority by simulating a virtual crewmember through Alexa and the iPad.
The video conferencing tests are scheduled in two-hour windows on days when the team has access to the 70-meter dishes of the Deep Space Network. Those dishes provide the highest possible bandwidth for the tests while Orion travels farther from Earth than any human-rated spacecraft ever has.
“So the way that it works is in Mission Control where we have the Operations Control Center, you’ll be sitting in front of what we call a desktop pro. That’s the Cisco setup. And that has a video camera,” noted Chambers.
“And then on board, we’ve got the speaker and Alexa. And underneath that is mounted the iPad. So when you’re talking in mission control, what you see in the spacecraft is Alexa and the iPad, and when speaking to Alexa, or she’s talking back, the blue ring will glow and your face, of course, will be on the iPad.”
The fourth major test for Callisto also ties to videoconferencing and bit-rate compression in the form of interactive whiteboards between the crew in Orion and controllers back in Houston.
“Let’s say [you’re in Orion and] you load up a picture of the Moon for the landing site,” related Chambers. “You pull up the picture. With [the crew’s] light pen, [the crew] circles where they’re going to land. Here on Earth, you’re kind of erasing that and then marking and zooming it in.”
“So it’s an interactive whiteboard capability. Now it’s got that time delay; we can’t fix physics. But in terms of interacting and collaborating and talking about ‘following this particular trajectory,’ a picture is worth a thousand words.”
Chambers continued, “So we’re able to test that out and confirm there are no glitches, that it’s smooth. It’s testing all through the digital network. The algorithms and protocols are the trick.”
Overall, the Callisto demonstration aims to prove, and gather data on, how common forms of communication can be carried forward into space exploration, not just to extend the comforts and familiarities of Earth into space, but also to support vital, mission-critical communications.
Moreover, the video conferencing element of the demonstration could also be applied on Earth, where government, media, and defense operations could all benefit from compressed bit rates while still maintaining image quality.
(Lead image: The Callisto demonstration in front of the ‘Commander’ mannequin inside the Orion spacecraft during Artemis I. Credit: NASA)