Robert Sweet: We just saw and heard some talks about our instruments. We're talking about robotics, et cetera. And what we're going to talk about a little bit today is the role of simulation, and how we can use simulation to help train people with these new devices. What I'm going to be presenting really represents about five years of work by a team of about 40 people, and this project is really a Department of Defense initiative.

The project really involves multiple different entities: universities, industry, different companies, as well as the Department of Defense. And the whole concept is to build a simulation platform, or system, that allows people out in the community to build systems that are interoperable with each other.

So again, our mission really is to create and disseminate this platform and these standards. The key is that these are free, open standards that you can get if you want to build a training system for your device. Your system will then communicate with everybody else's training systems and with other parts of the body that might be relevant to whatever you are building.

So it really, ultimately, creates an ecosystem for interoperable technologies around healthcare simulation. This is a real game-changer. What we wound up doing is creating, basically, what you see in this physical overview, and I'll go over some of the details here: multiple projects, with multiple independent vendors that don't necessarily need to talk to each other, but they can all connect through a core computing platform which we've designed and whose software we're providing for free.

There are even user interfaces for trainers, students, technicians, et cetera, and other virtual devices. So the overview of all the deliverables, which are now coming to fruition and which I've been hinting at over the last three or four years at this meeting and other meetings, is the following. And this is happening in September, October, November; in that range we'll be closing the project.

What we've done is create male and female anatomic datasets based on CT, MRI, body casting, and laser scanning of subjects. These subjects are younger, military-age individuals, and these files are available for you to use. We have created and tested this interoperable platform that, again, stimulates the community to develop these anatomically correct modules that feel like human organs, as well as sensors and electronics that can communicate data across the core to other people, to have an effect if necessary.

We also created and tested a universal connector that allows these modules to seamlessly connect with each other, and a test scenario to mimic and demonstrate these capabilities. We integrated an existing open-source physiology platform called BioGears so that your module can connect to it. You don't have to build your own physiology; it will already exist, and you'll be able to borrow from that technology and leverage millions of dollars of Department of Defense funding toward your project.

We've also integrated and tested multiple physical devices, which I'll show you. And what we're doing now is verifying and validating this thing called the Advanced Modular Manikin in collaboration with the American College of Surgeons' Accredited Education Institutes, which are worldwide training institutes of excellence that have now applied to do this testing.

So the first thing is some of the results of the research methodologies. We took our human subjects, obtained DICOM CT and MR, did external laser scans, converted them to 3D CAD models, and did 3D printing, mold making, and casting of these. And again, all these individual organs: this isn't just an animation; each of these structures is in a dataset that you can obtain and then use to build whatever it is that you want to build.

I'm not going to go into the details of this, but the second thing I'd mention is creating this interoperable platform and core, which is basically an operating system. This is how it works and communicates. For the engineers in the room, it communicates over DDS, which allows an enormous amount of data to go back and forth locally.
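To make the publish/subscribe idea concrete, here is a minimal sketch of the pattern in Python. This is not the actual AMM core or a real DDS binding; real DDS implementations add typed topics, network discovery, and quality-of-service settings. The `DemoBus` class and its methods are hypothetical, purely to illustrate the contract a module sees: publish a sample to a named topic, or subscribe to a topic with a callback.

```python
from collections import defaultdict
from typing import Any, Callable


class DemoBus:
    """Toy in-process stand-in for a DDS-style data bus (hypothetical, not AMM code).

    Real DDS adds typed samples, peer discovery, and QoS over the network; this
    only illustrates the publish/subscribe contract the modules are described
    as using.
    """

    def __init__(self) -> None:
        # topic name -> list of subscriber callbacks
        self._subscribers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[Any], None]) -> None:
        # A module registers interest in a topic; it is called for every sample.
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, sample: Any) -> None:
        # A module pushes a data sample; every subscriber to that topic receives it.
        for callback in self._subscribers[topic]:
            callback(sample)
```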

We created the third one, which is this universal connector; it had never been done before. This connector gives you data connection, power, fluids, and air without a single tool. You just use your hand to connect and disconnect it. And again, the CAD, the engineering drawings that you see here, are available to you for free, so you can go ahead and have a company make them for you.

Here's one that we did for our demo. It's connected. The black part is that CAD drawing and it's around … This happens to be for an extremity. That shows it's tool-less, back and forth. You can easily connect and disconnect it with a push of a button. You have data, power, fluid, and air in one connector.

Here's a figure of part of the core, as far as some of the physical components. There's a spine connector, which is a little bit different than the one you saw there for an extremity. We also have a network router and an example of an IV arm connecting to this core system. At least, this is what it looks like. Again, there's lots of space for what's really important, which is the systems we're working on, especially in urology when you're talking about the abdomen and pelvis.

We had to do testing of this connector. We needed to look at it and make sure that we would be able to have electronics and fluid together, which could be dangerous. We did vibration, or helicopter-environment-type, testing. When that passed, we did pull tests and torque tests to very large tolerances and demonstrated that the connector is sound and won't break.

We then developed a test scenario, and this is an example of a very early prototype. It actually looks a lot better than this, but it shows a system where this is the first responder putting an IV in. This happens to be me, but just for the demonstration here; I'm not a first responder, so don't get your hopes up if you need me. Then, once we put the IV in the arm, we also have an abdominal exam simulator. It's from a totally separate company, but it works with the system that's built in, and I'll show you how it connects. We brought CAE Healthcare's ultrasound FAST exam simulator in and it works. A totally separate company's airway simulation system also works, and then ultimately we showed we can do different things with that particular system, like intubation. Then we have a laparotomy model, which we'll show you, that actually connects into the system as well, and it can communicate things like blood loss and other important things that are happening to the physiology engine, so that your anesthesiologist could train with you.

So I'm doing a trauma surgery with a Colonel Rush from Madigan right now, and we're actually doing a splenectomy here. Moving on: this is how the system really works. This is just a schematic example of how the data bus works. Different modules can communicate with and subscribe to different data. So one module could report, for example, that fluid is being given, and it tells the system, "Okay, fluid is being given." A totally separate company made that module. But then that comes back and tells another module, for example, that that's what happened, and therefore it can respond. And subsequently, if another module, for example, has blood loss, it also talks to the physiology engine and would affect the IV arm: the pulse on that totally different company's IV arm would go up. So it's a unified system with standards that allow modules to talk to each other.
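As a sketch of that flow, reusing the hypothetical `DemoBus` from the earlier snippet: one vendor's module publishes a fluid-given event, a physiology module subscribes and updates its state, and a different vendor's IV arm subscribes to the resulting vital-sign samples. The topic names and payloads here are invented for illustration and are not the actual AMM data model.

```python
bus = DemoBus()  # hypothetical bus from the earlier sketch

# Vendor 1: an infusion module reports that fluid is being given.
def report_infusion(ml_given: float) -> None:
    bus.publish("infusion/bolus", {"ml_given": ml_given})

# Physiology module: listens for infusion and hemorrhage events, re-publishes vitals.
state = {"heart_rate": 80.0, "blood_volume_ml": 5000.0}

def on_infusion(sample: dict) -> None:
    state["blood_volume_ml"] += sample["ml_given"]
    bus.publish("vitals", dict(state))

def on_hemorrhage(sample: dict) -> None:
    state["blood_volume_ml"] -= sample["ml_lost"]
    state["heart_rate"] += 0.02 * sample["ml_lost"]  # toy compensatory tachycardia
    bus.publish("vitals", dict(state))

bus.subscribe("infusion/bolus", on_infusion)
bus.subscribe("hemorrhage", on_hemorrhage)

# Vendor 2: the IV arm simply displays whatever pulse the physiology publishes.
bus.subscribe("vitals", lambda v: print(f"IV arm pulse: {v['heart_rate']:.0f} bpm"))

# Vendor 3: a laparotomy module reports blood loss; the IV arm's pulse rises.
bus.publish("hemorrhage", {"ml_lost": 500})
report_infusion(250)
```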

The other important thing is that physiology engine, and we had to do a lot of testing. It was an open-source platform, and with this project we also added a couple of capabilities that I think are relevant to urology, and that is a physiologic response to pain as well as sepsis. This is a very, very small example of the type of data you can use, but you can really pass different physiologic data elements back and forth between different modules across the system, through the physiology engine.
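As a rough illustration of what those physiologic data elements might look like, here is a toy stand-in for a physiology engine. This is emphatically not the BioGears API (BioGears is a C++ engine with its own action and data-request interfaces); the class, methods, and coefficients below are hypothetical, only showing the idea that modules send actions such as pain or sepsis and read back shared vitals that can then be published for other modules.

```python
from dataclasses import dataclass


@dataclass
class ToyPhysiology:
    """Toy stand-in for a physiology engine (hypothetical; not the BioGears API)."""
    heart_rate: float = 72.0              # beats per minute
    respiration_rate: float = 14.0        # breaths per minute
    mean_arterial_pressure: float = 90.0  # mmHg

    def apply_pain_stimulus(self, severity: float) -> None:
        # Crude sympathetic response: vitals rise with pain severity (0..1).
        self.heart_rate += 30.0 * severity
        self.respiration_rate += 8.0 * severity
        self.mean_arterial_pressure += 15.0 * severity

    def apply_sepsis(self, hours: float) -> None:
        # Crude septic trend: tachycardia, tachypnea, and falling pressure over time.
        self.heart_rate += 5.0 * hours
        self.respiration_rate += 1.5 * hours
        self.mean_arterial_pressure -= 4.0 * hours

    def data_elements(self) -> dict:
        # The elements a module might publish onto the bus for other modules to use.
        return {
            "heart_rate": self.heart_rate,
            "respiration_rate": self.respiration_rate,
            "mean_arterial_pressure": self.mean_arterial_pressure,
        }


# Example: an abdominal exam module reports deep palpation of an acute abdomen.
engine = ToyPhysiology()
engine.apply_pain_stimulus(severity=0.7)
print(engine.data_elements())
```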

So what we really did is integrate and test multiple different physical and digital modules with various vendors. So it's not just physical. You can take VR modules, you can even take virtual patients, and subscribe and pass information back and forth across the system. These are examples of just some of the ones that we did, and here's one, for example: this is a company that had an existing abdominal exam simulator. It was a part-task trainer; we put the connector on it, and now it's part of the system. So when I push on that simulator it creates a pain response, because this patient happens to have an acute abdomen. It has skin on top of it once we put it there. The point is, it connects to the system, the system recognizes it's there, the pain response affects physiology, and the physiology can tell everybody else on the system what's happening.

This is one that's being developed in my lab for a urinary catheterization module; it will be brought into the system soon for urethral catheterization. Here's another one built in our lab for an advanced airway, under a separate Department of Defense contract, with an IV arm as well as intubation and airway, and we've been working on different tissues, fat tissues, et cetera. We created a multilayered bladder and spleens, and we built cartridges to swap vessels in and out, ultimately to go into this laparotomy module, which is probably the most exciting for this group.

So this is the back of the laparotomy shell, the abdominal-pelvic surgery module. You can see how it can fit seamlessly into the system, just like that simple abdominal exam module did. And this is the other side of it here, which is just an open shell for people to put different organs into and connect to the system, as you can see on the top. This is what we did for our demonstration; it was a trauma model. So here's the template, and this is all simulated: none of this is real tissue, it's all synthetic. It's a plate of the retroperitoneal organs, and then the fat is placed over it on a template and ultimately integrated into a laparotomy module. You can do laparoscopy and robotics, but we thought this would be the most exciting because we've never ever had a urologic open laparotomy simulation trainer. So this would be very novel for us and really advance our opportunities.

Here, for example, are some of the bladders; we actually made an intraperitoneal bladder rupture here. So cystorrhaphy was one of the things; as a urologist, I had to sneak something urologic into this project. The DOD has its priorities, but I thought it was important that we do something urologic; otherwise, I probably wouldn't be standing in front of you today. So we built this bladder and prostate system that can pop right in with the connector and swap in and out of the particular system, and it has very realistic properties; you can sew on it, et cetera. So here we're actually doing the trauma cystorrhaphy as it's integrated into the system. I'm doing that with Colonel Rush; I'm doing the bladder part at this point while he's doing the splenectomy. And I'll tell you, it was extremely engaging; both of us got a lot out of the system. It's hard enough to get even one attending surgeon interested in a simulator; we actually had two on this particular system.

We also purposely included a vena cava injury that they had to repair, and this is just a screenshot of that, again a simulated vena cava injury; this is all synthetic material. We also needed to do a female trauma manikin, and in the military this has become a real priority, because female soldiers were actually dying at a higher rate from preventable injuries. A lot of it was because their male counterparts were afraid to expose the chest. They found that, and so they really wanted a female trauma manikin for things like chest tube placement, so that type of thing didn't happen. Here's our connector; it's connected to a totally different company's system. You can see it has physiology, et cetera, and this one is for chest tubes. We put some very realistic breathing motion into this particular manikin and system for training.

Now what we're transitioning to is the verification and validation of the AMM with the American College of Surgeons, as I mentioned, and we're working with them closely. I'm proud to announce publicly, for the first time, that the ACS put an RFA out to all 92 Accredited Education Institutes around the world, and they selected three: San Diego Naval Hospital, Penn State, and CSTAR at the University of Western Ontario were the winners, and they will be performing studies of the system and the platform and its ability to train and assess.

The other thing we had to do as a deliverable is make this available, and so there's a website for this project where you can go and gain access to these systems and modules. Again, if you're building a new device and you want a training system that works with the system, especially if it's important that you're talking to anesthesiology during the procedure, this is probably a good thing to think about.

So in conclusion: as a whole, this industry has very limited resources, and for everyone to think that they're going to build their own system is just unrealistic. Why not harness the power of the crowd? Create an ecosystem that makes things interoperable, so that the people and individual groups building systems can shift their resources toward increasing educational value and expanding technical capabilities, which ultimately broadens the market of applications that simulation will have in healthcare. It really focuses innovation where it actually matters, not on reproducing things that may not even be as good as what exists now. Ultimately, it's the future of simulation in healthcare. Thank you very much.
