Deformation of a sphere
Posted 12.11.2009, 11:16 GMT-5 1 Reply
I need to evaluate the change in radius of a sphere deformed by its own weight and supported on three points in the "south" hemisphere.
Here is how I define my model:
I first create a sphere in 3D, centred in {0 0 0}, with radius, say, 1m.
I create a point at (-1, 0, 0), then rotate it by -45° around the y axis. I then copy this point twice, rotating the copies by 120° around the z axis.
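For reference, here is a minimal NumPy sketch (not COMSOL code) of the coordinates this construction gives; it is only meant to show where the three support points end up for the 1 m radius mentioned above.

```python
import numpy as np

def rot_y(deg):
    # Rotation matrix about the y axis
    a = np.radians(deg)
    return np.array([[ np.cos(a), 0.0, np.sin(a)],
                     [ 0.0,       1.0, 0.0      ],
                     [-np.sin(a), 0.0, np.cos(a)]])

def rot_z(deg):
    # Rotation matrix about the z axis
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

p0 = rot_y(-45.0) @ np.array([-1.0, 0.0, 0.0])        # point at (-1, 0, 0) rotated -45 deg about y
supports = [rot_z(k * 120.0) @ p0 for k in range(3)]  # two further copies rotated about z
for p in supports:
    print(np.round(p, 4))
```

All three points come out at z ≈ -0.707 m, i.e. on the "south" hemisphere where the supports are meant to sit.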
In subdomain settings, I apply a material to the sphere. I then apply a load in the -z direction, equal to 9.8 m/s^2 multiplied by the material density. In this way I simulate gravity acting on the object.
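As a small numerical illustration of that load expression (the density is an assumed example value, since the material is not stated here):

```python
# Volume force of -rho*g in the z direction, as described above.
g = 9.8        # m/s^2, as used in the model
rho = 2700.0   # kg/m^3, assumed example density (the actual material is not stated)
F_z = -rho * g # body load in N/m^3 to be entered as the z-component
print(F_z)     # -26460.0
```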
In point settings, I constrain the three points I created before in the x, y and z directions. In this way I simulate ideal contact points with my supporting structure.
For the mesh parameters, I use the predefined Extra fine mesh.
As a solver, I simply choose the Pardiso solver.
To test the isotropy of the problem, I then rotate the sphere (and only the sphere!) in successive steps of 10° around the y axis, repeating the simulation each time with all other parameters left unchanged.
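In outline, the sweep looks like the sketch below; the two helper functions are hypothetical stand-ins for the corresponding COMSOL steps, and only the loop structure mirrors the procedure described above.

```python
def rotate_sphere_about_y(angle_deg):
    # Hypothetical stand-in: rotate only the sphere geometry by angle_deg about y
    print(f"rotate sphere by {angle_deg} deg about y")

def solve_and_evaluate():
    # Hypothetical stand-in: re-solve with Pardiso and evaluate DeltaR on the equator
    print("solve and evaluate DeltaR")

for angle_deg in range(0, 360, 10):  # successive 10-degree steps
    rotate_sphere_about_y(angle_deg)
    solve_and_evaluate()
```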
For the solution, I look at DeltaR = sqrt(u^2 + v^2) on the slice defined by the equator of the sphere, i.e. the plane containing the x and y axes. This is equivalent to looking at the change in radius of the sphere.
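To make that post-processing step concrete, here is a minimal sketch with placeholder displacement data; in the actual model, u and v are the x- and y-displacement components of the solution evaluated on the z = 0 slice.

```python
import numpy as np

# Placeholder displacement samples on the equator (purely illustrative values;
# in practice u and v are exported from the solution on the z = 0 plane).
theta = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
u = 1e-8 * np.cos(theta)   # dummy x-displacements [m]
v = 1e-8 * np.sin(theta)   # dummy y-displacements [m]

delta_r = np.sqrt(u**2 + v**2)       # DeltaR as defined above
print(delta_r.min(), delta_r.max())  # the spread compared between the rotated runs
```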
First, I expect the centre of the sphere to have DeltaR almost equal to 0 (as the problem is symmetric). Also, I expect the solution not to change when I rotate the sphere (apart from, possibly, small numerical differences). This is not the case: the evaluated DeltaR often differs by one or more orders of magnitude between different solutions.
I would like to understand whether this is simply a meshing problem; it seems not, since the same differences appear if I solve the problem with a coarse mesh.
Does anybody have a clue how to solve this problem?