Simulated Surgery Journey

Jonathan R. Holeton
3 min read · Apr 29, 2024

A few months ago, I saw some amazing and realistic simulated surgery built with Unity3D. It inspired me to learn how it's done. In this post, I'll write about some of the people, tools, and techniques needed for simulated surgery.

REALISTIC ANATOMY

1. HEART

2. LUNGS

I asked ChatGPT to teach me how to do simulated surgery, and this is what it said:

Creating realistic soft tissue and blood for simulated surgery in Unity 3D involves several steps, from modeling and texturing to implementing physical properties and visual effects. I’ll break this process into manageable parts and provide detailed technical explanations along with some easy-to-understand analogies.

Step 1: Creating Soft Tissue Models

Technical Breakdown:

Soft tissues in the human body, like skin, fat, and muscle, need to be modeled with precision to mimic real-life anatomical structures. You can create these models in 3D modeling software like Blender or Maya and then import them into Unity.

  1. Modeling: Start by creating the base shape of the organ or tissue. This could be as simple as a sphere for something like a gallbladder, or more complex like the multipart structure of a heart.
  2. Texturing: Apply realistic textures to give visual depth and realism. You can use photographs of real tissues as references to create these textures.
  3. Rigging for Movement: If the tissue needs to move or be interactive (like a beating heart), you may need to rig it with bones or approximate soft-body behavior, for example with Unity's Cloth component or a simple script (a minimal sketch follows the analogy below).

Analogy:

Think of modeling soft tissue like sculpting with clay. You start with a lump (the basic shape) and refine it until it looks like the organ you’re trying to replicate. Texturing, then, is like painting your sculpture to add all the little details that make it look real.
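Unity has no built-in volumetric soft-body solver, so a common shortcut for something simple like a beating heart is to animate the imported mesh procedurally. Here's a minimal sketch of that idea; the class and field names (HeartBeat, beatsPerMinute, pulseScale) are my own placeholders rather than anything from a real surgical package, and you'd attach the script to the heart model you exported from Blender or Maya.

```csharp
using UnityEngine;

// Minimal illustrative sketch: makes an imported heart mesh "beat"
// by rhythmically scaling it. Not a real soft-body simulation.
public class HeartBeat : MonoBehaviour
{
    public float beatsPerMinute = 70f;   // resting heart rate
    public float pulseScale = 0.06f;     // how much the mesh swells per beat

    private Vector3 baseScale;

    void Start()
    {
        baseScale = transform.localScale; // remember the modelled size
    }

    void Update()
    {
        // Convert BPM into a 0..1 pulse and swell the mesh slightly each beat.
        float beatsPerSecond = beatsPerMinute / 60f;
        float pulse = Mathf.Abs(Mathf.Sin(Time.time * beatsPerSecond * Mathf.PI));
        transform.localScale = baseScale * (1f + pulse * pulseScale);
    }
}
```

A real simulation would deform individual vertices (or use a dedicated soft-body asset), but a uniform pulse like this is enough to check your model, textures, and lighting before going further.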

Step 2: Simulating Realistic Blood

Technical Breakdown:

Blood in a surgical simulation needs to react to other objects and behave in a physically realistic way when incisions are made or when organs are manipulated.

  1. Particle Systems: Unity’s particle system can be used to simulate flowing blood. This involves creating small particles that can flow and pool realistically.
  2. Physics Settings: Tune the particles' gravity, speed, and collision damping so they flow like a thick fluid rather than water. Blood is a non-Newtonian fluid, meaning it doesn't behave like simple water; its viscosity changes under stress. (A configuration sketch follows the analogy below.)
  3. Shaders and Rendering: Use shaders to give the blood a shiny, wet look that reacts to light as real blood would.

Analogy:

Imagine you’re trying to simulate water flowing from a hose in your garden. Now, instead of water, imagine it’s a bit thicker and stickier, like syrup. This is somewhat how blood behaves, and your task in Unity is to tweak the flow so it mimics this syrupy behavior, pooling where it hits surfaces and flowing along grooves.
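Here's a rough sketch of what that particle-system tuning might look like in a Unity C# script. The colour, speed, and damping values are guesses aimed at the "syrupy" feel described above, not validated surgical parameters, so treat them as starting points to tweak.

```csharp
using UnityEngine;

// Rough sketch: configures an attached ParticleSystem so its particles
// fall, collide with scene geometry, and lose energy like a thick fluid.
[RequireComponent(typeof(ParticleSystem))]
public class BloodEmitter : MonoBehaviour
{
    void Start()
    {
        var ps = GetComponent<ParticleSystem>();

        // Make particles heavier and slower than water-like defaults.
        var main = ps.main;
        main.startColor = new Color(0.45f, 0.0f, 0.02f); // dark red
        main.gravityModifier = 1.2f;
        main.startSpeed = 0.5f;

        // Let particles settle on world geometry instead of passing through it.
        var collision = ps.collision;
        collision.enabled = true;
        collision.type = ParticleSystemCollisionType.World;
        collision.dampen = 0.7f;  // lose speed on contact, "syrupy" feel
        collision.bounce = 0.05f; // almost no bounce
    }
}
```

Attach it to a GameObject that already has a ParticleSystem (for example, parented to the incision point), and remember that world collisions only happen against objects that have colliders.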

Step 3: Implementing Interactivity

Technical Breakdown:

Interactivity is crucial for surgical simulations. Users need to be able to cut through tissues and see realistic responses in the form of tissue separation and blood flow.

  1. Collision Detection: Implement collision detection, so the system knows when a surgical tool touches or cuts through the tissue.
  2. Dynamic Response: Program the tissue to react when it’s cut — this could mean changing the texture at the cut site to show exposed underlying layers or starting a blood particle effect.
  3. Feedback Loops: Provide haptic or audio feedback to simulate the feeling or sound of a real surgical procedure, enhancing the realism.

Analogy:

Think of the tissue as a filled jelly doughnut. If you were to cut through it, you’d expect the jelly to ooze out at the cut, right? Similarly, when cutting through the simulated tissue, the “blood” needs to react appropriately, and the “dough” (tissue) should visibly separate at the incision.
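Putting those three pieces together, here's a minimal sketch of a cuttable tissue script. It assumes the surgical tool is tagged "Scalpel" and that a blood ParticleSystem prefab, a "wound" material, and a cut sound are assigned in the Inspector; all of those names are illustrative, not from the article.

```csharp
using UnityEngine;

// Sketch of a cuttable tissue surface: when an object tagged "Scalpel"
// touches it, spawn a blood particle effect at the contact point,
// swap to a "wound" material, and play a cutting sound.
public class CuttableTissue : MonoBehaviour
{
    public ParticleSystem bloodEffectPrefab; // assigned in the Inspector
    public Material woundMaterial;           // texture showing exposed layers
    public AudioClip cutSound;

    private AudioSource audioSource;

    void Start()
    {
        audioSource = GetComponent<AudioSource>();
    }

    void OnCollisionEnter(Collision collision)
    {
        if (!collision.gameObject.CompareTag("Scalpel")) return;

        // 1. Dynamic response: show the cut by swapping the surface material.
        GetComponent<Renderer>().material = woundMaterial;

        // 2. Blood: spawn the particle effect where the scalpel made contact.
        Vector3 cutPoint = collision.GetContact(0).point;
        Instantiate(bloodEffectPrefab, cutPoint, Quaternion.identity);

        // 3. Feedback: audio cue (haptics for VR controllers would go here too).
        if (audioSource != null && cutSound != null)
            audioSource.PlayOneShot(cutSound);
    }
}
```

Note that OnCollisionEnter only fires when both objects have colliders and at least one has a Rigidbody, and a production simulator would also re-mesh the tissue at the incision rather than just swapping materials.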

Testing Your Simulation

Once you have these elements in place, test your simulation rigorously to ensure realism and accuracy. Adjust the fluid dynamics, tweak the texture responses, and refine the interaction scripts as necessary.
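Most of this testing is hands-on and visual, but if you want an automated safety net, Unity's Test Framework (NUnit-based play-mode tests) can catch obvious regressions. Here's a small hedged sketch that assumes the HeartBeat script from Step 1 exists in your project:

```csharp
using System.Collections;
using NUnit.Framework;
using UnityEngine;
using UnityEngine.TestTools;

// Play-mode test sketch (Unity Test Framework): verifies the HeartBeat
// script from Step 1 actually swells the mesh over a run of frames.
public class HeartBeatTests
{
    [UnityTest]
    public IEnumerator HeartMeshPulsesOverTime()
    {
        var heart = new GameObject("Heart").AddComponent<HeartBeat>();
        float restingScale = heart.transform.localScale.x;
        float maxScale = restingScale;

        // Watch the heart for a batch of frames and record its largest size.
        for (int frame = 0; frame < 60; frame++)
        {
            yield return null; // advance one frame
            maxScale = Mathf.Max(maxScale, heart.transform.localScale.x);
        }

        Assert.Greater(maxScale, restingScale,
            "The heart mesh never swelled, so the pulse script isn't running.");
    }
}
```

You could add similar tests for the blood emitter and the cutting response, alongside the manual tuning described above.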

Surgical Cutting with Position-Based Dynamics

The researchers who created the simulations above include Iago Berndt.
