Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - Alexander Stukowski

Pages: [1] 2 3 ... 32
Support Forum / Re: Dislocation in Tensile Test
« on: October 10, 2018, 08:31:12 PM »
Ovito provides an analysis function (see here) for identifying and tracking dislocation line defects. However, it only works for 3D bulk crystals; it cannot be applied to sheet materials or CNTs.

However, Ovito provides a collection of general tools for analyzing simulations and visualizing defects. Perhaps they can be useful for your purposes. Right now we don't have a good idea of what you want to do or what your simulation model looks like. If you like, give us some more insight into your problem. Perhaps we can provide some hints on how Ovito can be useful.


Support Forum / Re: Problem in executing OVITOS
« on: October 10, 2018, 12:26:54 AM »
Under macOS, the "ovitos" executable may not be fully functional or not even present in the subdirectory under the build directory where you ran "cmake" and "make". Instead, you should find an executable named "ovitos.exe" in that location, which should work. The final "ovitos" executable gets created in the installation directory only when you run the "make install" step.

In any case, before you try running "ovitos", please confirm that the "ovito" graphical program is working normally and that the embedded Python interpreter operates as expected.

Dear Jatink,

I am glad I was able to help a bit.

I am currently on vacation, though, which is why I can provide only very limited support at the moment. What you described sounds like a more complicated problem that I cannot address before I return to Germany (a week from now).

Maybe my colleague Constanze, or other users in this forum, will be able to provide you with some support in the meantime. In general, however, I suggest you contact us directly if you are interested in a scientific collaboration, direct technical support, and/or more in-depth training on Ovito's analysis and scripting capabilities.


Support Forum / Re: Problem in executing OVITOS
« on: October 09, 2018, 12:17:36 AM »
Dear ad,

Which version of Ovito did you install? And what is your current macOS version?


The answer to your first question is yes. The append() instruction adds the user-defined modifier function to the data pipeline. This is similar to what happens when you insert a modifier in the graphical user interface of Ovito. All the modifier functions in the current pipeline get called by the system to process the input data and compute the resulting output data.

There are several ways of counting the O atoms. In the present case, it's easiest to extend our user-defined modifier function to emit a second global attribute:

Code: [Select]
import numpy

def compute_max_z(frame, input, output):
    output.attributes["High_Z"] = numpy.max(input.particles["Position"][:,2])
    output.attributes["Ocount"] = numpy.count_nonzero(input.particles["Particle Type"] == 2)

Replace the literal "2" in this code, which I just used as an example, with the atom type ID for O atoms.

Note that in Ovito the term "property" typically refers to local per-atom information, whereas the term "attribute" refers to a global value, i.e. information not related to individual particles. You would like to calculate an attribute from the local per-atom information and export it to a file.

The ComputePropertyModifier is not the right tool for that, as it can only compute new per-atom properties. When calling export_file() with the "txt" output format, Ovito expects you to specify a list of attributes to export. However, "High_Z" computed by the ComputePropertyModifier will be a particle property, not a global attribute. That's why you see the error.

You need to write a user-defined modifier in Python instead. Here is a draft for a script. Note that I cannot test it myself, because I am on vacation without access to a computer running Ovito:

Code: [Select]
from ovito.io import import_file, export_file
import numpy

node = import_file(...)

def compute_max_z(frame, input, output):
    output.attributes["High_Z"] = numpy.max(input.particles["Position"][:,2])

node.modifiers.append(compute_max_z)  # insert the user-defined modifier into the pipeline

export_file(node, "max_z.txt", "txt", columns=["Frame", "High_Z"], multiple_frames=True)

A for-loop is not needed in this simple case, because export_file() will automatically step through the animation frames thanks to the multiple_frames=True option. The system will evaluate the data pipeline for every animation frame. As part of a pipeline evaluation, the compute_max_z() function will be invoked by the system, which computes the maximum of the z-component of the "Position" particle property, storing the result in a new output attribute with the name "High_Z".

Dear jatink,

The ObjectNode.output field has been deprecated in Ovito 3.0.0. It's still there for some limited backward compatibility, but it only reflects the output of the data pipeline at frame 0 of the loaded simulation sequence.

Instead of

Code: [Select]
coordinates = node.output.particle_properties.position.array

you should now use

Code: [Select]
data = node.compute(frame)
coordinates = data.particles["Position"]


Support Forum / Re: Steinhardt order parameters
« on: October 05, 2018, 05:08:02 PM »
OVITO doesn’t have a function with this name. I’m not sure what is meant by that. Please let me know where you found the term “bond order parameter analysis”. Then I can tell you if it is available in OVITO, perhaps under a different name.

Support Forum / Re: How to get hcp orientation out of quartenions?
« on: October 05, 2018, 04:49:08 PM »
Hi Alex,

Please take a look at this updated version of the PTM doc page:

Peter Larsen (user ‘pmla’ in this forum) has included a table that precisely defines the reference orientation for each lattice type. I’m not 100% sure, but I believe that for HCP the reference orientation is indeed defined such that [0001] is parallel to the z-axis.

A possible source of error (please check yourself) may be the representation of quaternions. OVITO employs a (x,y,z,w) representation, but other codes use a (w,x,y,z) representation.
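To illustrate the representation issue, here is a minimal NumPy sketch for reordering a quaternion from a (w,x,y,z) convention into OVITO's (x,y,z,w) storage order. The helper function name is my own, not part of any API:

Code: [Select]
```python
import numpy as np

# Hypothetical helper: reorder a quaternion given in (w,x,y,z) convention
# into the (x,y,z,w) storage order used by OVITO.
def wxyz_to_xyzw(q):
    q = np.asarray(q)
    return np.roll(q, -1, axis=-1)  # move the w component from front to back

q_other = np.array([1.0, 0.0, 0.0, 0.0])  # identity rotation in (w,x,y,z)
q_ovito = wxyz_to_xyzw(q_other)           # becomes [0, 0, 0, 1] in (x,y,z,w)
```

The same roll (with shift +1) converts back in the other direction.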

If you want to calculate an average orientation for a crystal, you can do that by computing the arithmetic mean of the atomic quaternions as long as their orientations are not close to a boundary of the fundamental zone, where symmetry flips occur. Currently, OVITO doesn’t provide a function for computing an average orientation from the per-atom orientations which takes into account the lattice symmetries. I hope we can add something like this in the future.
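As a sketch of what such a naive averaging could look like, under the assumption stated above that all orientations lie well inside the fundamental zone and the quaternions are sign-consistent (the numbers below are made up):

Code: [Select]
```python
import numpy as np

# Sketch: average per-atom orientation quaternions stored in (x,y,z,w) order.
# Valid only when no symmetry flips occur, i.e. all orientations lie well
# inside the fundamental zone and have consistent signs.
quats = np.array([
    [ 0.00, 0.0, 0.0, 1.00000],
    [ 0.01, 0.0, 0.0, 0.99995],
    [-0.01, 0.0, 0.0, 0.99995],
])
mean_q = quats.mean(axis=0)
mean_q /= np.linalg.norm(mean_q)  # renormalize to a unit quaternion
```

For orientations near a fundamental-zone boundary this simple arithmetic mean breaks down, which is exactly why a symmetry-aware averaging function would be needed.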

Let me mention that the existing DXA function of OVITO also calculates mean orientations of crystallites as a side product of the dislocation identification. These grain orientations are currently not accessible via the public Python API though. The only way to obtain them is to export the DXA results to a .ca file. Here you will find the mean orientation matrices for each crystallite identified by the DXA.



The problem you encountered in "Phase 1" is due to a typo. You ran the command "tar xzf ovito-3.0.0-dev234-x86_64" to extract the archive file, but forgot the ".tar.gz" file extension. Apparently, a subdirectory named "ovito-3.0.0-dev234-x86_64" already existed; that's why you got that weird error message.

Running "tar xzf ovito-3.0.0-x86_64.tar.gz" should work. You wrote that you used the flags "xvf" instead. But under Ubuntu Linux, the "z" (decompress) flag is optional for .tar.gz archives, because GNU tar auto-detects the compression, and the "v" flag (verbose mode) doesn't affect the behavior. So "xzf" and "xvf" should lead to the same result.

The Ubuntu package repositories contain an outdated program version of OVITO (v2.3.3), which doesn't fully support the CFG file format. That's the version you had installed initially and it's the one that is invoked when you simply run "ovito" in a terminal or if you use the program menu to start up OVITO. Please uninstall this outdated version, e.g. by running "sudo apt-get remove ovito" or using the Ubuntu software center. 

The latest OVITO versions are only available as archive files that need to be manually downloaded and extracted. It looks like you accomplished this already, aside from the little problem during the extraction step.


I loaded your file in OVITO 2.9.0 and OVITO 3.0.0-dev, without any problems. Which version of the program are you using?


Support Forum / Re: can ovito movie rendering be paralleled?
« on: September 23, 2018, 07:41:30 PM »
Dear Kyu,

Several computational analysis algorithms are parallelized in OVITO, for example CNA, Compute Property, Wigner-Seitz analysis, and more. They will already make use of all processor cores. The Tachyon renderer is also multithreaded and will use all cores during image rendering. So there is usually little to gain from additionally parallelizing the rendering of a movie across frames.

However, what you can do, if you really want to squeeze out the last bit of performance, is to restrict Ovito (or ovitos) to a single processor core using the --nthreads command line option and run multiple Ovito instances in parallel, letting each instance render a different subset of the animation frames. Finally, you have to manually create a video from the saved animation frames using an external video encoding tool of your choice.
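This frame-splitting approach could be sketched in a shell script as follows. This is only an illustration: the frame counts and the script name "render_range.py" are hypothetical placeholders, and only the --nthreads option comes from the description above. The commands are echoed rather than executed here:

Code: [Select]
```shell
# Hypothetical sketch: split 200 animation frames across 4 single-threaded
# ovitos instances. "render_range.py" stands for an assumed user script that
# renders the frames between a start and an end index.
NFRAMES=200
NJOBS=4
CHUNK=$((NFRAMES / NJOBS))
for i in $(seq 0 $((NJOBS - 1))); do
  START=$((i * CHUNK))
  END=$(((i + 1) * CHUNK))
  # In a real run, append '&' to launch the instances in parallel:
  echo "ovitos --nthreads 1 render_range.py $START $END"
done
# Afterwards, encode the saved frames with an external tool, e.g.:
#   ffmpeg -framerate 30 -i frame.%04d.png movie.mp4
```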

Support Forum / Re: dislocation analysis
« on: September 20, 2018, 11:52:10 AM »
I see. So you actually would like to identify dislocations in a hexagonal lattice.

The DXA function in OVITO does not support any 2D crystals, i.e. sheet-like materials like graphene and your hexagonal lattice. The algorithm was only implemented for three-dimensional crystals such as fcc and bcc, where dislocations are 1-dimensional (line-like) defects.

Identifying dislocations in 2D crystals requires a different method, but it happens to be much easier, because here dislocations are 0-dimensional (point-like) defects. Here is a paper describing a simple algorithm for finding dislocations in 2D lattices:

I cannot give you a ready-to-use code for this, but it should be relatively easy to implement this algorithm in a Python script. Maybe this is something you can do yourself.

Support Forum / Re: dislocation analysis
« on: September 20, 2018, 10:43:35 AM »
Dear Bahman,

I'm not sure if I understand: What exactly is a "2D fcc model"? Face-centered cubic crystal lattices are by definition three-dimensional.


Support Forum / Re: Save the distance between two particles
« on: September 19, 2018, 09:42:38 PM »

Note that in your Python script you are subtracting two particle identifiers instead of two particle positions:

Code: [Select]
distance = np.linalg.norm( input.particles["Particle Identifier"][655] - input.particles["Particle Identifier"][679] )

The expression "input.particles["Particle Identifier"][655]" gives you the numeric ID of the 655th particle in the particles list. Obviously, it doesn't make sense to subtract two IDs. What you want is the difference between two particle positions. XYZ coordinates are stored in the "Position" particle property:

Code: [Select]
distance = np.linalg.norm( input.particles["Position"][655] - input.particles["Position"][679] )

Note, however, that this will subtract the position of the 679th particle in the list from that of the 655th particle. These are probably not the ones you are interested in, because there is a difference between particle identifiers and particle indices. The ID is an explicit property of each particle, which sticks to a particle even if the storage order of particles changes or when you delete some particles. The particle index, on the other hand, is an implicit property: it only specifies the position of the particle within the current list of particles. If you delete some particles using the Slice modifier or the Delete Selected modifier in OVITO, the indices of the remaining particles typically change, because the indices always form a consecutive sequence from 0 to N-1, where N is the current number of particles.
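The difference between IDs and indices can be illustrated with a small, purely hypothetical NumPy example (the IDs are just made-up numbers matching your case):

Code: [Select]
```python
import numpy as np

# Illustration: IDs stick to particles, indices do not.
ids = np.array([655, 679, 700, 701])   # "Particle Identifier" values
# Suppose the particle at index 0 (ID 655) gets deleted, as e.g. the
# Slice or Delete Selected modifier would do:
ids_after = np.delete(ids, 0)
# ID 679 still exists unchanged, but its index has shifted from 1 to 0:
index_of_679 = np.nonzero(ids_after == 679)[0][0]
```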

You can find the particle IDs in the "Particle Identifier" column of the particle data inspector (see your screenshot). Particle indices are shown in the leftmost column of the table.

(As a side note: The simulation code LAMMPS assigns unique IDs to atoms ranging from 1 to N. But during a simulation run, it tends to reorder particles and when they are written to a dump snapshot, the current ordering is typically random (you can change that behavior using the dump_modify sort command). That's why there is generally no well-defined relationship between the ID of a particle and its index, i.e. its position in the list of particles loaded from the dump file).

You want to compute the distance between the two particles with IDs 655 and 679. However, in Python you can access particles only by index! That means you first have to determine the indices of the two particles before you can access their position vectors. You can do that as follows:

Code: [Select]
index1 = np.nonzero(input.particles["Particle Identifier"] == 655)[0][0]
index2 = np.nonzero(input.particles["Particle Identifier"] == 679)[0][0]
distance = np.linalg.norm( input.particles["Position"][index1] - input.particles["Position"][index2] )

Note that the np.nonzero() function is the typical NumPy way of finding the occurrences of a particular value (in this case the ID) in an array. In other words, it searches for the value 655 in the ID array and gives back the 0-based index. This index can then be used to access the corresponding value in the coordinates array.
