NGSolve GUI eigen mode mesh deformation field

Hi, I would like to visualize the eigenmode solutions of a solid mesh in my own UI.
For this I upload everything needed (vertex positions, triangle indices, and the mode shape evaluated at the mesh vertices) to GPU buffers and do the deformation in the vertex shader with

in vec3 position;  // undeformed vertex position
in vec3 real;      // real part of the mode shape at this vertex
in vec3 imag;      // imaginary part of the mode shape at this vertex
uniform float time_value;
const float pi = 3.14159;
const float scale = 10.;
// harmonic motion: Re(mode * exp(2*pi*i*t)), component-wise
vec3 displacement = real*cos(2.*pi*time_value) + imag*sin(2.*pi*time_value);
vec3 r = position + displacement*scale;
gl_Position = MCDCMatrix * vec4(r, 1.0);
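For reference, the same harmonic-motion formula can be sanity-checked on the CPU; a minimal NumPy sketch with made-up mode values for a single vertex (the numbers are stand-ins, not real data):

```python
import numpy as np

# Hypothetical mode values for one vertex (stand-ins, not real data)
real = np.array([0.1, 0.0, 0.0])
imag = np.array([0.0, 0.2, 0.0])
scale = 10.0

def displacement(t):
    # same formula as the shader: Re(mode * exp(2*pi*i*t)), component-wise
    return scale * (real * np.cos(2.0 * np.pi * t) + imag * np.sin(2.0 * np.pi * t))

print(displacement(0.0))   # at t = 0 only the real part contributes
print(displacement(0.25))  # at t = 0.25 only the imaginary part contributes
```

Evaluating a few vertices this way and comparing against the native UI's deformed positions would localize whether the difference is in the formula or elsewhere in the pipeline.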

With this I do not get the same result as with the deformation checkbox in the NGSolve native UI.

Left is my implementation and right is the NGSolve native UI; in my implementation the two tips of the tuning fork (left) do not look like those in the NGSolve UI.
The result does not look completely wrong, just different. My question is: how is the deformation done in the NGSolve UI, and how can I get the same results?

Thanks for any help :slight_smile:

Hi Kai,

the code snippet looks very reasonable, that’s what we are doing. Difficult to spot the bug.

What does it look like if you use only the real part? What changes if you change the scale parameter?

The picture looks like you get different scaling for the x and y components of the displacement.
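One way to test the per-component-scaling suspicion offline is to evaluate the displacement at t = 0 (real part only) and check that the x/y ratio of each mode vector is preserved; a small NumPy sketch with made-up values:

```python
import numpy as np

# Made-up mode vectors for two vertices (stand-ins, not real data)
real = np.array([[0.10, 0.20, 0.00],
                 [0.05, 0.40, 0.00]])

scale = 10.0
t = 0.0  # real part only: cos(0) = 1, sin(0) = 0

disp = scale * real * np.cos(2.0 * np.pi * t)

# A uniform scale preserves the y/x ratio of each mode vector;
# a per-axis scaling bug would change it.
ratios = disp[:, 1] / disp[:, 0]
print(ratios)  # should equal real[:, 1] / real[:, 0]
```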


Ok, I found a solution: instead of directly modifying vertex positions in the shader, I use a custom filter, vtkProgrammableFilter, and it is surprisingly fast

      import math
      import numpy as np
      import vtkmodules.all as vtk
      from vtkmodules.util.numpy_support import vtk_to_numpy, numpy_to_vtk

      self.prgfilter = vtk.vtkProgrammableFilter()
      self.time = 0
      def animateFilter():
          input_data = self.prgfilter.GetInputDataObject(0, 0)
          output_data = self.prgfilter.GetOutputDataObject(0)
          # copy topology and point data; only the points get replaced
          output_data.ShallowCopy(input_data)

          points = vtk_to_numpy(input_data.GetPoints().GetData())
          real = vtk_to_numpy(input_data.GetPointData().GetArray(self.function_name))
          #imag = vtk_to_numpy(input_data.GetPointData().GetArray("eigenmode1"))
          self.time = self.time + 1
          t = self.time / 100.0
          self.time = self.time % 100

          amp = 0.5
          result = points + (real * np.cos(t * 2.0 * math.pi)) * amp
          #result = real*np.cos(t*2.0*math.pi) + imag*np.sin(t*2.0*math.pi)
          new_points = vtk.vtkPoints()
          new_points.SetData(numpy_to_vtk(result, deep=True))
          output_data.SetPoints(new_points)

      self.prgfilter.SetExecuteMethod(animateFilter)


Now everything looks correct :slight_smile:
