Large Systems

Up to this point, this tutorial has presented simple examples that can be reproduced on a desktop computer or a small workstation. But KITE was developed as flexible software for systems exceeding a billion atoms. Hence, in the following tutorial we tackle the simulation of two different systems whose sizes are in the micrometer range.

Graphene lattice with vacancy disorder

We simulated two graphene systems with vacancy disorder, containing more than 8.9 (large) and 0.5 (small) billion atoms. The concentration of vacancy defects is 0.1%, and the number of moments used in the calculation is 15000. The two subplots show the zero-energy modes in the density of states that appear in graphene with a dilute concentration of vacancies.
The system in subplot (a) has size l1 = 65536, l2 = 65536, while in subplot (b), l1 = 8064, l2 = 8192. The dashed curve is pristine graphene and the solid curves correspond to different resolutions: yellow (20 meV), blue (10 meV), green (5 meV), purple (1 meV).

There is an important difference between the two subplots that is related to the lattice size. The limiting factor for observing fine details in the expanded spectral functions is the mean level spacing set by the system size. This can be seen when simulating a small lattice at high resolution, as in subplot (b) with 1 meV resolution: oscillations in the DOS signal that we are reaching the limit where the quantized nature of a finite-size system shows up as discrete energy levels. Probing a small system with too high a resolution can therefore produce non-physical effects. Working with high resolutions (and consequently large systems) is especially important when dealing with localization problems [1].

If you would like to reproduce these results, check the script honeycomb_lat_vacancy.py under the examples. For reference, running the DOS calculation on the small system specified above requires ~5 GB of RAM and around 20 minutes on a node with 28 cores.
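For orientation, vacancies in KITE are introduced through the StructuralDisorder class described later in this documentation. A minimal sketch of the relevant part of such a configuration script, assuming a honeycomb lattice object and the configuration/calculation objects from Getting Started have already been defined, could look like this (the 0.001 concentration corresponds to the 0.1% quoted above; the filename is a placeholder):

```python
import kite

# Sketch of a vacancy-disorder setup, assuming KITE's python interface;
# `lattice`, `configuration` and `calculation` are defined as in Getting Started.
struc_disorder = kite.StructuralDisorder(lattice, concentration=0.001)
struc_disorder.add_vacancy('A')  # vacancies on sublattice A
struc_disorder.add_vacancy('B')  # vacancies on sublattice B

kite.config_system(lattice, configuration, calculation,
                   filename='honeycomb_vacancy.h5',
                   disorder_structural=struc_disorder)
```

The full honeycomb_lat_vacancy.py example remains the reference for the exact parameters used in the figures.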

Graphene vacancy

Moiré pattern

The second example is a twisted bilayer graphene lattice in the clean limit, with the number of atoms exceeding ~0.7 billion. The model Hamiltonian [2] of such a system has a much larger coordination number (average number of neighbors per atomic site), and the important parameter when estimating the running time (and the memory requirements) is the “effective” size: the product of the number of sites and the coordination number. In that sense, this system lies in the mid range, between the small and the large systems of the previous example.

Depending on the rotation between adjacent layers, the resulting moiré pattern has a greater or smaller effect on the low-energy electrons in graphene. For high twist angles, the van Hove singularities lie far from the charge neutrality point. As the angle is decreased, they “move” to lower energies and eventually merge, forming flat bands at the Dirac point. This happens at the so-called “magic” angles of rotation, which have attracted great interest recently, as they provide an interesting playground for studying correlations in graphene, with novel states of matter ranging from Mott insulators to superconductors.

Subplots (a) and (b) show the DOS for rotation angles of 2.005 (low) and 13.174 (high) degrees, respectively. As in the previous case of disordered graphene, fine details around the Dirac point appear depending on the resolution. High twist angles give rise to properties similar to those of AB-stacked bilayer graphene. At low angles the spectrum is much richer, and the linear spectrum of the two layers is fully changed by the induced interlayer coupling. Apart from the electron-hole asymmetry, at higher energies both densities follow the shape of the monolayer one. As in the previous example, higher resolutions in the calculations lead to the appearance of oscillations in the density of states. Interestingly, as the angle is decreased, more states appear in the low-energy spectrum, which reduces the mean level spacing. For the same high resolution of 1 meV, the oscillations mentioned in the previous section appear in subplot (b), while the same DOS plot in subplot (a) looks very smooth. For an arbitrary system, both the spectrum and the size determine the level of detail you are able to resolve while still obtaining physically meaningful results.

TBLG clean

Below, you can find the script used to configure the KITE model for twisted bilayer graphene.
The lattice of twisted bilayer graphene (especially at low rotation angles) has a much more complex unit cell than simple AB-stacked bilayer: for an arbitrary twist, one cannot reach all the neighbors using simple neighboring distance vectors, and a more advanced search is needed to find the unit-cell connections. To avoid making this script too complex, the neighbor search and the hopping calculation are done separately. Here we simply use an already defined lattice object that is loaded with a simple command:

# define the angle
angle = 21.787  # or 13.174, 7.341, 2.005
# define the name of the pb.Lattice object
name = 'lattice_tblg_{:.3f}'.format(angle)
# load the lattice
lattice = pb.load(name)

Now, you can continue with specifying the other configuration settings as explained in Getting Started.
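As a reminder, those settings follow the same pattern as in Getting Started. A sketch with placeholder values (the divisions, system size, number of moments and filename below are illustrative, not the ones used for the figures) could be:

```python
# Illustrative configuration sketch; the numerical values are placeholders.
configuration = kite.Configuration(divisions=[64, 64], length=[8192, 8192],
                                   boundaries=[True, True],
                                   is_complex=False, precision=1)
calculation = kite.Calculation(configuration)
calculation.dos(num_points=1000, num_moments=15000, num_random=1, num_disorder=1)
kite.config_system(lattice, configuration, calculation, filename='tblg.h5')
```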

The full script can be downloaded from here.

[1] A. Ferreira and E. Mucciolo, Phys. Rev. Lett. 115, 106601 (2015)

[2] P. Moon and M. Koshino, Phys. Rev. B 85, 195458 (2012).

Optical Conductivity: Graphene with Gaussian Disorder

KITE also calculates the optical conductivity of a given lattice at a given Fermi energy. To illustrate this capability, we calculate the optical conductivity of disordered graphene, which can be compared qualitatively with previous results 1.

Instead of defining the lattice in our python script, we can use one of the pre-defined lattices from pybinding:

from pybinding.repository import graphene

lattice = graphene.monolayer()

To illustrate a different type of disorder, we add random on-site energies that follow a Gaussian distribution:

disorder = kite.Disorder(lattice)
disorder.add_disorder('B', 'Gaussian', 0.0, 1.5)
disorder.add_disorder('A', 'Gaussian', 0.0, 1.5)

where we define the type of statistical distribution, the sublattices where the disorder is located, the mean value of the distribution and its width.

After configuring the system, as presented in the Getting Started section, it is time to set up the calculation:

calculation = kite.Calculation(configuration)

calculation.conductivity_optical(num_points=1000, num_disorder=1,
                                 num_random=20, num_moments=512, direction='xx')

The optical conductivity can also be calculated along different directions, which is quite interesting for Hamiltonians with non-trivial topology that also present a transverse optical conductivity. In this example, however, we focus on the longitudinal optical conductivity. The other quantities that can be set in the python script are the same as for the density of states: the number of energy points used in KITE-tools, the number of moments in the expansion, random vectors and disorder realisations.

When calculating the optical conductivity, it is also necessary to set the Fermi energy. In this BETA version, however, the executable KITE-tools only calculates the optical conductivity at one pre-defined Fermi energy. To change it, it is necessary to edit the source code and recompile KITE-tools.

The results for the real and imaginary parts of the optical conductivity presented in the first figure were obtained on a normal desktop with calculations that took 8 minutes for a system with $N=512\times 512$ unit cells and 512 expansion moments.

image

For systems sizes of $N=1536\times 1536$ unit cells and two different Gaussian widths, we show $\Re [\sigma_{xx}(\omega)]$ for low frequencies, where we can see Drude’s peak for $\omega\rightarrow 0$ and the onset of interband transitions at $\hbar \omega>2 E_F$2.

image

The complete python script for this calculation can be found here

1 Shengjun Yuan, Rafael Roldán, Hans De Raedt, Mikhail I. Katsnelson, Phys. Rev. B 84, 195418 (2011).

2 T. Stauber, N. M. R. Peres, A. K. Geim, Phys. Rev. B 78, 085432 (2008).

Disordered Haldane model and the transverse conductivity

The Haldane Hamiltonian is a single-orbital tight-binding model on a honeycomb lattice with a sublattice-staggered on-site potential (orbital mass) and complex hoppings between next-nearest-neighbor sites that produce a staggered magnetic-field configuration with vanishing total flux through the unit cell 1. This model is a Chern insulator (also called a quantum anomalous Hall insulator) because it hosts the integer quantum Hall effect in the absence of an external magnetic field. This characteristic makes the Haldane model ideal for illustrating another capability of KITE: the calculation of transverse conductivities.

Let us begin with the definition of the Hamiltonian for the case of pure imaginary next-nearest-neighbor hoppings:

from math import sqrt
import pybinding as pb

def haldane():
    """Return the lattice specification for the Haldane model"""
    a = 0.24595  # lattice constant
    a_cc = a / sqrt(3)  # distance between neighbors
    t = -1  # nearest-neighbour hopping
    t2 = 0.1 * t  # strength of the complex hopping
    m = 0.1  # orbital mass
    # create a lattice with 2 primitive vectors
    lat = pb.Lattice(
        a1=[a, 0],
        a2=[a/2, a/2 * sqrt(3)]
    )

    lat.add_sublattices(
        # name, position, and onsite potential (orbital mass)
        ('A', [0, -a_cc/2],  m),
        ('B', [0,  a_cc/2], -m)
    )

    lat.add_hoppings(
        # inside the main cell
        ([0,  0], 'A', 'B', t),
        # between neighboring cells
        ([1, -1], 'A', 'B', t),
        ([0, -1], 'A', 'B', t),
        ([1, 0], 'A', 'A', t2 * 1j), #complex next-nearest hop.
        ([0, -1], 'A', 'A', t2 * 1j),
        ([-1, 1], 'A', 'A', t2 * 1j),
        ([1, 0], 'B', 'B', t2 * -1j),
        ([0, -1], 'B', 'B', t2 * -1j),
        ([-1, 1], 'B', 'B', t2 * -1j)
    )

    return lat

With the definition of our model, we can include different types of disorder, as documented here. For simplicity, we consider an onsite uniform disorder distribution with a width of 0.6 eV and zero average onsite energy (Anderson disorder):

disorder = kite.Disorder(lattice)
disorder.add_disorder('A', 'Uniform', 0.0, 0.6)
disorder.add_disorder('B', 'Uniform', 0.0, 0.6)

Now we are ready to calculate the Hall conductivity. After defining kite.configuration, as explained in the Getting Started documentation, we can set kite.calculation. The post-processing tool uses the energy bounds from the density of states to perform the integration in energy, so it is better to couple the conductivity calculation with a DOS calculation:

calculation.dos(num_points=1000, num_moments=512, num_random=10, num_disorder=1)

calculation.conductivity_dc(num_points=1000, num_moments=256, num_random=50,
                            num_disorder=1, direction='xy', temperature=50)

This is a full spectral calculation where KITEx calculates the coefficients of the Chebyshev expansion and KITE-tools uses those moments to calculate the transverse conductivity. Both temperature and num_points are parameters used by KITE-tools, and it is possible to modify them without running KITEx again. This type of calculation typically requires more RAM than DOS or single-shot DC conductivity, which imposes limitations on the system sizes (which can still reach large scales with the available memory). The relative error of the stochastic trace evaluation (STE) scales with the inverse of the system size, which means that full-spectrum conductivities typically require more random vectors to decrease the relative error of the STE. The relative error of the STE also depends on the Hamiltonian and on the calculated quantities. Transverse conductivities show larger fluctuations, at least in the part of the spectrum outside the topological gap, and this tutorial illustrates this issue.

Fig. 1 shows the density of states and the longitudinal and transverse conductivities for a small lattice of the Haldane model, in a calculation that took 3 minutes on a laptop. KITEx captures the anomalous quantum Hall plateau extremely well, with a relative error of less than 0.1%. But it is also clear that the transverse conductivity presents significantly more fluctuations outside the plateau than the longitudinal conductivity, even though we already considered 50 random vectors.

image

We now focus on strategies to decrease the fluctuations. Depending on the computational resources, one possibility is increasing the system size. It is also possible to increase the number of random vectors. This is illustrated in Fig. 2.

image

Finally, there are other physical ways of damping the fluctuations: temperature and disorder. The use of these last two resources depends on the goals of the numerical calculation. In the present case, where we want to see the quantum anomalous Hall plateau, we can simply consider Anderson disorder and work with intermediate temperatures. To get a feeling for how KITE works, we suggest downloading the full script for this calculation and playing with variations of system size, number of random vectors, disorder and temperature.

1 F. D. M. Haldane, [Phys. Rev. Lett. 61, 2015 (1988)](http://link.aps.org/doi/10.1103/PhysRevLett.61.2015).

2 J. H. García, L. Covaci, and T. G. Rappoport, Phys. Rev. Lett. 114, 116602 (2015) (Supplementary material)

Phosphorene: DC conductivity along different directions

This is a small tutorial to illustrate the use of KITE to investigate materials with anisotropic electrical conductivity. To this end, we consider a simplified tight-binding model for single layer phosphorene 1. Even though this model is very simple, it captures the anisotropic band structure of phosphorene, which is Dirac like in one direction and Schrödinger like in the other direction. This behaviour results in highly anisotropic transport properties along the different directions 2.

Here, we calculate the single-energy longitudinal conductivity (singleshot_conductivity_dc) in the vicinity of the band gap and show that a fast numerical calculation, set to run on a normal laptop for about 3-4 minutes, can qualitatively reproduce the expected anisotropic conductivity along the xx and yy directions.

Here, we highlight parts of the python scripts. The complete scripts can be downloaded here for the xx conductivity and here for the yy conductivity.

After the imports that are necessary for KITE, we define the lattice, with Pybinding syntax:

def monolayer_4band(num_hoppings=4):
    """Monolayer phosphorene lattice using the four-band model

    Parameters
    ----------
    num_hoppings : int
        Number of hopping terms to consider: from t2 to t5.
    """
    a = 0.222  # nm
    ax = 0.438  # nm
    ay = 0.332  # nm
    theta = 96.79 * (pi / 180)
    phi = 103.69 * (pi / 180)

    lat = pb.Lattice(a1=[ax, 0], a2=[0, ay])

    h = a * sin(phi - pi / 2)
    s = 0.5 * ax - a * cos(theta / 2)
    lat.add_sublattices(('A', [-s/2,        -ay/2, h], 0),
                        ('B', [ s/2,        -ay/2, 0], 0),
                        ('C', [-s/2 + ax/2,     0, 0], 0),
                        ('D', [ s/2 + ax/2,     0, h], 0))

    lat.register_hopping_energies({'t1': -1.22, 't2': 3.665, 't3': -0.205,
                                   't4': -0.105, 't5': -0.055})

    if num_hoppings < 2:
        raise RuntimeError("t1 and t2 must be included")
    elif num_hoppings > 5:
        raise RuntimeError("t5 is the last one")

    if num_hoppings >= 2:
        lat.add_hoppings(([-1,  0], 'A', 'D', 't1'),
                         ([-1, -1], 'A', 'D', 't1'),
                         ([ 0,  0], 'B', 'C', 't1'),
                         ([ 0, -1], 'B', 'C', 't1'))
        lat.add_hoppings(([ 0,  0], 'A', 'B', 't2'),
                         ([ 0,  0], 'C', 'D', 't2'))
    if num_hoppings >= 3:
        lat.add_hoppings(([ 0,  0], 'A', 'D', 't3'),
                         ([ 0, -1], 'A', 'D', 't3'),
                         ([ 1,  1], 'C', 'B', 't3'),
                         ([ 1,  0], 'C', 'B', 't3'))
    if num_hoppings >= 4:
        lat.add_hoppings(([ 0,  0], 'A', 'C', 't4'),
                         ([ 0, -1], 'A', 'C', 't4'),
                         ([-1,  0], 'A', 'C', 't4'),
                         ([-1, -1], 'A', 'C', 't4'),
                         ([ 0,  0], 'B', 'D', 't4'),
                         ([ 0, -1], 'B', 'D', 't4'),
                         ([-1,  0], 'B', 'D', 't4'),
                         ([-1, -1], 'B', 'D', 't4'))
    if num_hoppings >= 5:
        lat.add_hoppings(([-1,  0], 'A', 'B', 't5'),
                         ([-1,  0], 'C', 'D', 't5'))

    lat.min_neighbors = 2
    return lat

This model, as defined above, can be used with different numbers of hoppings. The user can decide the number used in the calculation when defining the lattice:

lattice = monolayer_4band(num_hoppings=4)

To use the large-scale “single-shot” algorithm for direct evaluation of zero-temperature DC conductivities, the resolvent operator requires a nonzero broadening (resolution) parameter eta, which is given in eV. As this type of calculation is energy dependent, it is also necessary to provide a list of the desired energy points to the calculation object. In single-shot calculations, the computational time scales linearly with the number of energy points. For this example, which is intended to run on a normal desktop, we consider a small number of points and set the energy range in the vicinity of the band gap.

The number of points and the list of energy points can be created when calling the calculation, as illustrated here:

calculation = kite.Calculation(configuration)
calculation.singleshot_conductivity_dc(energy=[(1.0 / 25 * i)*3.5  for i in range(25)],         
                                       num_moments=512, num_random=5, num_disorder=1,
                                       direction='xx', eta=0.02)

Alternatively, one can define the number of points and the energy list outside the calculation call:

npoints = 25
epoints = [(1.0 / npoints * i) * 3.5 for i in range(npoints)]

calculation.singleshot_conductivity_dc(epoints, num_moments=512, num_random=5,
                                       num_disorder=1, direction='xx', eta=0.02)

Now it is time to save the configuration in an hdf file:

kite.config_system(lattice, configuration, calculation, modification, 'phxx.h5')

It is not possible to request the same type of calculation twice in a single configuration. In this case, we want to calculate the conductivity along both the xx and yy directions, where the type of the calculation is the same, which means we need another hdf file for the yy conductivity.

Let’s repeat the procedure for another direction:

calculation.singleshot_conductivity_dc(epoints, num_moments=512, num_random=5,
                                       num_disorder=1, direction='yy', eta=0.02)
kite.config_system(lattice, configuration, calculation, modification, 'phyy.h5')

For completeness, we provide the two python scripts for both orientations.

The result of this fast calculation can be seen in the figure below, for l1 = 512, l2 = 512. To get a feeling for how KITE works, we suggest modifying parameters like eta and num_random.

phosphorene

In the next figure, we repeat the calculation with 300 energy points, 10 random vectors and a larger energy window.

image

1 Alexander N. Rudenko, Mikhail I. Katsnelson, Phys. Rev. B 89, 201408 (2014)

2 H. Liu, A. T. Neal, Z. Zhu, X. Xu, D. Tomanek and P. D. Ye, ACS Nano 8, 4033 (2014).

Editing the HDF file

What is an HDF file?

Some extracts from the HDF group tutorial:

Hierarchical Data Format 5 (HDF5) is a unique open source technology suite for managing data collections of all sizes and complexity.

HDF5 has features of other formats but it can do much more. HDF5 is similar to XML in that HDF5 files are self-describing and allow users to specify complex data relationships and dependencies. In contrast to XML documents, HDF5 files can contain binary data (in many representations) and allow direct access to parts of the file without first parsing the entire contents.

HDF5 also allows hierarchical data objects to be expressed in a natural manner (similar to directories and files), in contrast to the tables in a relational database. Whereas relational databases support tables, HDF5 supports n-dimensional datasets and each element in the dataset may itself be a complex object. Relational databases offer excellent support for queries based on field matching, but are not well-suited for sequentially processing all records in the database or for selecting a subset of the data based on coordinate-style lookup.

Editing the file

As discussed in the post-processing documentation, it is possible to calculate a quantity under different conditions with the same moments of an expansion. In these cases, one needs to change some parameters in the hdf file that are settings for the post-processing tool. By editing the .h5 file, we can change the temperature of a conductivity calculation or the number of energy points.

For that purpose, we provide a simple python script that rewrites specific parts of our .h5 files. As discussed above, the .h5 contains hierarchical data objects that are similar to the structure of directories and files.

When modifying a parameter like temperature, we need to locate in the .h5 file the quantity that is going to be calculated and modify its temperature. The script shows how to list the parameters associated with each quantity and how to edit one parameter.

import h5py
import numpy as np

file_name = 'archive.h5'
f = h5py.File(file_name, 'r+')     # open the file

# List all groups
print('All groups')
for key in f.keys():  # Names of the groups in HDF5 file.
    print(key)
print()

# Get the HDF5 group
group = f['Calculation']

# Checkout what keys are inside that group.
print('Single group')
for key in group.keys():
    print(key)
print()
# if you want to modify another quantity, check the list and change the subgroup below
# Get the HDF5 subgroup
subgroup = group['conductivity_dc']

# Checkout what keys are inside that subgroup.
print('Subgroup')
for key in subgroup.keys():
    print(key)
print()

new_value = 70
data = subgroup['Temperature']  # load the data
data[...] = new_value  # assign new values to data
f.close()  # close the file

# To confirm the changes were properly made and saved:

f1 = h5py.File(file_name, 'r')
print(np.allclose(f1['Calculation/conductivity_dc/Temperature'][()], new_value))
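The read-modify-close pattern above can be wrapped in a small reusable helper. The sketch below is plain h5py, not part of KITE itself; the dataset path is whatever your .h5 file uses:

```python
import h5py

def set_parameter(file_name, dataset_path, value):
    """Overwrite an existing dataset in an HDF5 file in place."""
    with h5py.File(file_name, 'r+') as f:
        f[dataset_path][...] = value

def get_parameter(file_name, dataset_path):
    """Read a dataset back, e.g. to confirm a change was saved."""
    with h5py.File(file_name, 'r') as f:
        return f[dataset_path][()]
```

For example, set_parameter('archive.h5', 'Calculation/conductivity_dc/Temperature', 70) performs the same edit as the script above.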

Post-processing tools

General description

As presented in previous sections, KITE calculates the Chebyshev moments of a given expansion and stores them in the same hdf file that was generated by the configuration script. These moments are used by the post-processing tool named KITE-tools to reconstruct the requested physical quantities specified in the configuration script, such as the density of states or the optical conductivity.

In a basic setup, the python configuration script specifies the desired values of the parameters, which are exported to the hdf file together with other settings. In this case, the hdf file containing the Chebyshev moments calculated by KITE works as an input: KITE-tools reads the parameters and the Chebyshev moments from the hdf file, uses them to calculate the desired quantities and exports them to data files. If a parameter is not specified, a default value is used.

However, most of these quantities depend on a set of parameters (temperature, Fermi energy, …) that are not necessary for the calculation of the Chebyshev moments. Therefore, the user can modify them and recalculate quantities with KITE-tools without recomputing the Chebyshev moments, which is the time-consuming part of the calculation. For example, if the user wants to compute the optical conductivity at several Fermi energies, the Chebyshev moments are computed once and KITE-tools can then be used to obtain the optical conductivity for each Fermi energy.

For this purpose, the user has the flexibility to override the parameters from the python script by specifying them on the command line. For example, with the setting

./KITE-tools archive.h5 --CondOpt -F 1.2 -O -2 3 400

KITE-tools will calculate the optical conductivity with a Fermi energy of 1.2 (in the units specified in the configuration file) for 400 values of the frequency in the interval between -2 and 3 (in the same units), even if the configuration file presents different values.

Compilation

KITE-tools is compiled in a very similar way to KITE, using the CMake framework for portability.

cd tools/build
cmake ..
make

CMake will automatically search for the required libraries (HDF5 and Eigen3) and generate the adequate Makefile.

Usage

Default usage

Its default usage is very simple:

./KITE-tools archive.h5

where archive.h5 is the HDF file that stores the output of KITE. If KITE-tools does not find this output, it will return an error. The output of KITE-tools is a set of .dat files, one for each of the requested quantities. KITE-tools may be executed without any additional parameters; all the unspecified parameters required for the calculation will be set to sensible default values. At the moment, KITE-tools is able to compute the following quantities:

  • Local density of states (LDOS)
  • Angle-resolved photoemission spectroscopy (experimental) (ARPES)
  • Density of states (DOS)
  • DC conductivity (CondDC)
  • Optical conductivity (CondOpt)
  • Second-order optical conductivity (CondOpt2)

The SingleShot DC conductivity does not require the post-processing through KITE-tools.

Advanced usage

KITE-tools supports a set of command-line instructions to force it to use user-specified parameters for each of the quantities mentioned in the previous section. The syntax is as follows:

./KITE-tools archive.h5 --quantity_to_compute1 -key_1 value_1 -key_2 value_2 --quantity_to_compute2 -key_3 value_3 ...

Each function to compute is specified after a double hyphen (--) and the parameters of each function are specified after a single hyphen (-). The list of available commands is as follows:

Function Parameter Description
--LDOS -N Name of the output file
--LDOS -M Number of Chebyshev moments
--LDOS -K Kernel to use (jackson/green). green requires a broadening parameter. Example: -K green 0.01
--LDOS -X Exclusive. Only calculate this quantity

--ARPES -N Name of the output file
--ARPES -E min max num Energy range and number of energy points
--ARPES -F Fermi energy
--ARPES -T Temperature
--ARPES -V Wave vector of the incident wave
--ARPES -O Frequency of the incident wave
--ARPES -X Exclusive. Only calculate this quantity

--DOS -N Name of the output file
--DOS -E Number of energy points
--DOS -M Number of Chebyshev moments
--DOS -K Kernel to use (jackson/green). green requires a broadening parameter. Example: -K green 0.01
--DOS -X Exclusive. Only calculate this quantity

--CondDC -N Name of the output file
--CondDC -E Number of energy points used in the integration
--CondDC -M Number of Chebyshev moments
--CondDC -T Temperature
--CondDC -S Broadening parameter of the Green’s function
--CondDC -d Broadening parameter of the Dirac delta
--CondDC -F min max num Range of Fermi energies. min and max may be omitted if only one is required
--CondDC -t Number of threads
--CondDC -I If 0, CondDC uses the DOS to estimate the integration range
--CondDC -X Exclusive. Only calculate this quantity

--CondOpt -N Name of the output file
--CondOpt -E Number of energy points used in the integration
--CondOpt -M Number of Chebyshev moments
--CondOpt -T Temperature
--CondOpt -F Fermi energy
--CondOpt -S Broadening parameter of the Green’s function
--CondOpt -O min max num Range of frequencies

--CondOpt2 -N Name of the output file
--CondOpt2 -E Number of energy points used in the integration
--CondOpt2 -M Number of Chebyshev moments
--CondOpt2 -R Ratio of the second frequency relative to the first one
--CondOpt2 -P If set to 1, writes all the different contributions to separate files
--CondOpt2 -T Temperature
--CondOpt2 -F Fermi energy
--CondOpt2 -S Broadening parameter of the Green’s function
--CondOpt2 -O min max num Range of frequencies
All the values specified in this way are assumed to be in the same units as the ones used in the configuration file. All quantities are double-precision numbers except the ones representing integers, such as the number of points. This list may also be obtained from KITE-tools by running

KITE-tools --help

Output

In the table below, we specify the name of the files that are created by KITE-tools according to the calculated quantity and the format of the data file.

Quantity File Column 1 Column 2 Column 3
Local Density of States ldos{E}.dat lattice position LDOS [Re]
ARPES arpes.dat k-vector ARPES [Re]
Density of States dos.dat energy DOS [Re] DOS [Im]
Optical Conductivity optical_cond.dat Frequency Opt. Cond [Re] Opt. Cond [Im]
DC Conductivity condDC.dat Fermi energy Cond [Re] Cond [Im]
Second-order optical conductivity nonlinear_cond.dat Frequency NL Cond [Re] NL Cond [Im]

  • All linear conductivities are in units of e^2/h.
  • Both Planck’s constant and the electron charge are set to 1.
  • LDOS outputs one file for each requested energy; the requested energy appears as {E} in the file name.
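Since these output files are plain whitespace-separated text, they can be loaded directly with numpy. The sketch below uses made-up sample values purely for illustration of the dos.dat layout (energy, real part, imaginary part):

```python
import numpy as np

# Mock dos.dat with the layout described above; the numbers are illustrative.
with open('dos.dat', 'w') as f:
    f.write("-1.0 0.12 0.0\n"
            "0.0 0.35 0.0\n"
            "1.0 0.13 0.0\n")

data = np.loadtxt('dos.dat')
energy, dos_re, dos_im = data[:, 0], data[:, 1], data[:, 2]
print(energy.shape)  # (3,)
```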

For more details on the type of calculations performed during post-processing, check Resources where we discuss our method.

The single-shot DC conductivity does not need any post-processing, as it is an energy-dependent calculation where the conductivity is computed on the fly. In this particular case, the data is extracted directly from the hdf file with the following python script:

import h5py #read h5 files
import numpy as np
file_name = 'archive.h5' #h5 file
file_input = h5py.File(file_name, 'r+')

# extract the single shot
single_shot = file_input['Calculation']['singleshot_conductivity_dc']['SingleShot']
np.savetxt('cond_singleshot.dat', np.c_[single_shot[:, 0], single_shot[:, 1]])

Examples

Example 1

./KITE-tools h5_file.h5 --DOS -E 1024

Processes the .h5 file as usual, but ignores the number of energy points for the density of states stored there. Instead, KITE-tools will use the value 1024 specified in the command.

Example 2

./KITE-tools h5_file.h5 --CondDC -E 552 -S 0.01

Processes the .h5 file but uses 552 points for the energy integration and a broadening parameter of 0.01.

Example 3

./KITE-tools h5_file.h5 --CondDC -T 0.4 -F 500

Calculates the DC conductivity using a temperature of 0.4 and 500 equidistant Fermi energies spanning the spectrum of the Hamiltonian.

Example 4

./KITE-tools h5_file.h5 --CondDC -F -1.2 2.5 30 --CondOpt -T 93

Calculates the DC conductivity using 30 equidistant Fermi energies in the range [-1.2, 2.5] and the optical conductivity using a temperature of 93.

Adding disorder


The purpose of this section is to provide a simple overview of the different types of disorder that can be added to KITE tight-binding calculations. The general character of our disorder implementation is one of the main features of KITE. To achieve this generality, the implementation follows a basic structure: the user specifies the disorder pattern to be included (which can be restricted to one unit cell or can connect neighboring unit cells), and the pattern is reproduced randomly inside the sample according to a defined concentration and statistical distribution.

After defining a lattice with the procedure explained in Getting Started, we can add disorder to our system. Usually, disorder can be modeled either as a modification of the onsite potentials on the lattice sites or as a combination of onsite potential and bond disorder. Hence, KITE allows the user to select between the two types of disorder by choosing between predefined classes in the python interface. The interface provides two different disorder classes:

  • Disorder – onsite disorder with three possible statistical distributions;
  • StructuralDisorder – generic structural disorder, a combination of onsite potential and bond disorder.

Onsite disorder

Disorder adds randomly generated onsite terms at the sites of a desired sublattice based on a certain statistical distribution:

  • Gaussian;
  • Uniform;
  • Deterministic.

Besides the type of statistical distribution, we can select the sublattice in which the disorder will appear, as well as the mean value and the standard deviation of the selected distribution. To include onsite disorder following a given statistical distribution, we build the lattice and use the following procedure:

disorder = kite.Disorder(lattice) # define an object based on the lattice
disorder.add_disorder('A', 'Gaussian', 0.1, 0.1) # add Gaussian distributed disorder at all sites of a selected sublattice

In a single object it is possible to select multiple sublattices, each one with a different disorder distribution, following the rule disorder.add_disorder('sublattice', 'type', mean, std):

disorder.add_disorder('A', 'Gaussian', 0.1, 0.1)
disorder.add_disorder('B', 'Uniform', 0.2, 0.1)
disorder.add_disorder('C', 'Deterministic', 0.1)

In the case of deterministic disorder, the standard deviation is not set. These quantities are in the same units as the ones specified in the rest of the configuration file.

After defining the desired disorder, it can be added to the configuration file as an additional parameter in the config_system function:

kite.config_system(..., disorder=disorder)

A complete example that calculates the density of states of graphene with different on-site disorder distributions for each sublattice can be seen here:

with the resulting density of states:

image

Structural disorder

The StructuralDisorder class adds the possibility of selecting between two different types of structural disorder: vacancies, randomly distributed with a certain concentration over the sites of a selected sublattice, and a more generic structural disorder that is a combination of onsite terms and bond disorder (also distributed with a certain concentration).

Vacancy disorder

The vacant site distribution can be selected from a single sublattice with a concentration defined in a parent object:

struc_disorder = kite.StructuralDisorder(lattice, concentration=0.2) 

struc_disorder.add_vacancy('B') # add a vacancy to a selected sublattice 

IMPORTANT: to distribute the vacancies over both sublattices, one needs to add the vacancies on each sublattice as a separate object of the class StructuralDisorder (unless you want precisely the same pattern of disorder in both sublattices):

struc_disorder_A = kite.StructuralDisorder(lattice, concentration=0.1)
struc_disorder_A.add_vacancy('A')
struc_disorder_B = kite.StructuralDisorder(lattice, concentration=0.1)
struc_disorder_B.add_vacancy('B')
disorder_structural = [struc_disorder_A, struc_disorder_B]
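What a given concentration means can be illustrated with a standalone numpy sketch (not KITE's internal code): each site of a sublattice is independently turned into a vacancy with probability equal to the concentration, and using two separate objects gives the two sublattices independent patterns:

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells = 200_000      # arbitrary number of unit cells, for illustration
concentration = 0.1    # fraction of sites turned into vacancies

# Independently mark ~10% of the sites of each sublattice as vacant,
# mimicking two separate StructuralDisorder objects (patterns differ).
vacant_A = rng.random(n_cells) < concentration
vacant_B = rng.random(n_cells) < concentration

print(vacant_A.mean(), vacant_B.mean())          # both close to 0.1
print(np.array_equal(vacant_A, vacant_B))        # the patterns are different
```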

Structural disorder

Before discussing this class of disorder, it is important to mention that in the pre-release version it is not possible to perform the automatic rescaling of the spectrum in the presence of hopping disorder. In this case, it is necessary to add an extra parameter to the configuration class:

configuration = kite.Configuration(divisions=[nx, ny], length=[lx, ly], boundaries=[True, True], is_complex=False, precision=1, spectrum_range=[-10, 10])

The following example shows a definition of our most general type of disorder, which includes both onsite disorder terms and bond modifications. This type of disorder can be added as an object of the class StructuralDisorder. The procedure for adding structural disorder is the same as adding a hopping term to the Pybinding lattice object, with the single difference that the bond disorder is not bound to a hopping term starting from the [0, 0] unit cell, which is the case for hopping terms in Pybinding.

For the sake of clarity, let us first define the nodes (unit cell index plus sublattice) that will compose the disorder. In this case we are not restricted to a single unit cell:

node0 = [[+0, +0], 'A'] # define a node in a unit cell [i, j] selecting a single sublattice
node1 = [[+0, +0], 'B']
node2 = [[+1, +0], 'A']
node3 = [[+0, +1], 'B']
node4 = [[+0, +1], 'A']
node5 = [[-1, +1], 'B']

After the definition of a parent StructuralDisorder object, we can select the desired pattern:


struc_disorder = kite.StructuralDisorder(lattice, concentration=0.2) # define an object based on the lattice with a certain concentration

struc_disorder.add_structural_disorder(
    # add bond disorder in the form [from unit cell], 'sublattice_from', [to_unit_cell], 'sublattice_to', value:
    (*node0, *node1, 0.5),
    (*node1, *node2, 0.1),
    (*node2, *node3, 0.5),
    (*node3, *node4, 0.3),
    (*node4, *node5, 0.4),
    (*node5, *node0, 0.8),
    # in this way we can add onsite disorder in the form [unit cell], 'sublattice', value
    ([+0, +0], 'B', 0.1)
)
# It is possible to add multiple different disorder types which should be forwarded to the config_system function
# as a list.
another_struc_disorder = kite.StructuralDisorder(lattice, concentration=0.6)
another_struc_disorder.add_structural_disorder(
    (*node0, *node1, 0.05),
    (*node4, *node5, 0.4),
    (*node5, *node0, 0.02),
    ([+0, +0], 'A', 0.3)
)

Before exporting the settings to the hdf file, it is possible to define multiple disorder realizations which will be superimposed to the clean system.
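As the comments in the example above indicate, multiple StructuralDisorder objects are forwarded to config_system as a list. By analogy with the onsite-disorder case, the call takes the following form (the disorder_structural keyword name is taken from KITE's bundled examples):

```python
# Forward all StructuralDisorder objects together as a list:
kite.config_system(..., disorder_structural=[struc_disorder, another_struc_disorder])
```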

The following script provides a minimal example of how to configure the structural disorder

with the resulting density of states

image

Getting started

Code workflow

KITE has 3 different layers: a user interface (Python); a main program (C++); and a post-processing tool (C++). The TB model is built and exported together with the calculation settings to an HDF5 file (*.h5), which is then used as I/O to the main program (called KITEx). The workflow is as follows:

  • Export model and calculation settings to a *.h5 file for KITEx;
  • Run KITEx;
  • Run the post-processing tools and visualise the data;

Below, we illustrate the use of the Pybinding package to build a TB model for a crystal and present the basic functionalities of KITE. Those already familiar with Pybinding may consider skipping to Examples.

Building and exporting a TB model

KITE’s intuitive user interface is based on Pybinding, with bespoke features for complex disorder/field modifications and target functions, e.g., DOS, conductivity, etc.

Importing the Pybinding package

If all the installation requirements are fulfilled, the Pybinding package can be imported in the Python script. In this tutorial, the required packages will be imported with the following aliases:

import kite 
import pybinding as pb
import numpy as np
import matplotlib.pyplot as plt

If you want to use pybinding's predefined styles for visualizing the lattice, simply add to the script:

pb.pltutils.use_style()

Building the model

The most important object for building a TB model is pb.Lattice, which carries the information about the crystal structure (lattice and basis) and hopping parameters. Pybinding also provides additional features based on the real-space information, as for example, the reciprocal vectors and the Brillouin zone. As a simple example, let us consider a square lattice with a single lattice site.

First, import all the packages (including KITE special features):

import kite 
import pybinding as pb
import numpy as np
import matplotlib.pyplot as plt

The following syntax can be used to define primitive lattice vectors:

a1 = np.array([1, 0]) # [nm] define the first lattice vector
a2 = np.array([0, 1]) # [nm] define the second lattice vector

lat = pb.Lattice(a1=a1, a2=a2) # define a lattice object

Add the desired lattice sites inside the unit cell:

onsite = [0] # onsite potential of sublattice A

lat.add_sublattices(
    # make a lattice site (sublattice) with a tuple
    # (name, position, and onsite potential)
    ('A', [0, 0], onsite[0])
)

By default the main cell has the index [n1,n2] = [0, 0]. The hoppings between neighboring sites can be added with the simple syntax:

lat.add_hoppings(
    # make a hopping between lattice sites with a tuple
    # (relative unit cell index, site from, site to, hopping energy)
    ([1, 0], 'A', 'A', -1),
    ([0, 1], 'A', 'A', -1)
)

Here, the relative indices n1,n2 represent the number of integer steps needed to reach a neighboring cell starting from the main one.

Important note: When adding the hopping (n, m) between sites n and m, the conjugate hopping term (m, n) is added automatically and it is not allowed to add them twice.

Now we can plot the lattice:

lat.plot()
plt.show()

and visualize the Brillouin zone:

lat.plot_brillouin_zone()
plt.show()

For a crystal with two atoms per unit cell see our graphene example. For other examples and pre-defined lattices consult the Pybinding documentation.
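As a sanity check of the model above: the single-band square lattice has the analytic dispersion E(k) = -2t[cos(kx a) + cos(ky a)], with t = 1 for the hoppings of -1 defined earlier (the conjugate hoppings added automatically supply the second cosine term of each pair). A short numpy scan of the Brillouin zone, independent of KITE and Pybinding, confirms the expected bandwidth of 8t:

```python
import numpy as np

t = 1.0  # hopping amplitude (the -1 entries above are -t)

def energy(kx, ky, a=1.0):
    """Bloch energy of the single-band square lattice: the two hoppings
    added above plus their automatically added conjugates."""
    return -2 * t * (np.cos(kx * a) + np.cos(ky * a))

# Sample the Brillouin zone and check the bandwidth [-4t, 4t]
k = np.linspace(-np.pi, np.pi, 201)
kx, ky = np.meshgrid(k, k)
bands = energy(kx, ky)
print(bands.min(), bands.max())
```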

Calculation settings and KITE target functions

The following Python classes from KITE are used to define the target functions and calculation settings:

  1. Configuration
  2. Calculation
  3. Modification

Configuration

The class Configuration carries the following information:

  • divisions – integer number that defines the number of decomposition parts in each spatial direction. KITEx implements a domain decomposition technique to divide the lattice into various partitions that are computed in parallel. To activate this feature, set a number of decomposition parts larger than one: nx * ny > 1. For example:
nx = ny = 2

decomposes a 2D lattice into four regions of equal size. The product nx * ny equals the number of threads used by KITEx and thus must not exceed the number of available cores in the computer.

The domain decomposition is optimized at the design level and allows a substantial speed up of multithreaded calculations. We recommend its usage.

  • length – integer number of unit cells along the direction of lattice vectors:
lx = 256
ly = 256

The lateral sizes of the decomposed parts are given by lx/nx and ly/ny, which must be integers.

  • boundaries – boolean value. True for periodic boundary conditions and False for open boundary conditions. Currently, KITEx only accepts periodic boundary conditions.

  • is_complex – boolean value. For optimisation purposes, KITEx only considers and stores complex data when is_complex=True; False should be used for real symmetric Hamiltonians.

  • precision – integer identifier of data type. KITEx allows users to define the precision of the calculation. Use 0 for float, 1 for double, and 2 for long double.

  • spectrum_range – array of reals (OPTIONAL). By default KITE executes an automated rescaling of the Hamiltonian; see Resources. Advanced users can override this feature using spectrum_range=[Emin,Emax], where Emin(Emax) are the minimum (maximum) eigenvalues of the TB matrix.
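The role of spectrum_range can be understood from the standard Chebyshev rescaling, which maps the spectrum [Emin, Emax] into [-1, 1] via H → (H - b)/a with a = (Emax - Emin)/2 and b = (Emax + Emin)/2. The numpy sketch below illustrates the mapping on a toy diagonal matrix; it shows the idea, not KITEx's internal code:

```python
import numpy as np

# Rescaling parameters from spectrum_range=[-10, 10] as in the example above
emin, emax = -10.0, 10.0
a = (emax - emin) / 2
b = (emax + emin) / 2

# A toy diagonal "Hamiltonian" with eigenvalues spanning the full range:
H = np.diag(np.linspace(emin, emax, 7))
H_rescaled = (H - b * np.eye(H.shape[0])) / a

# All rescaled eigenvalues now lie in [-1, 1], as the Chebyshev expansion requires
print(H_rescaled.diagonal().min(), H_rescaled.diagonal().max())
```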

As a result, a Configuration object is structured in the following way:

configuration = kite.Configuration(divisions=[nx, ny], length=[lx, ly], boundaries=[True, True], is_complex=False, precision=1)
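The integer constraints tying length and divisions together (lx/nx and ly/ny integer, nx * ny not exceeding the core count) can be verified before exporting. The helper below is hypothetical, not part of KITE's API:

```python
def check_decomposition(lx, ly, nx, ny, max_threads=None):
    """Validate domain-decomposition settings (hypothetical helper, not
    part of KITE's API): each partition must have an integer lateral
    size, and nx * ny threads must not exceed the available cores."""
    if lx % nx or ly % ny:
        raise ValueError("lx/nx and ly/ny must be integers")
    if max_threads is not None and nx * ny > max_threads:
        raise ValueError("nx * ny exceeds the available cores")
    return lx // nx, ly // ny  # lateral sizes of each partition

print(check_decomposition(256, 256, 2, 2))
```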

Calculation

Finally, the Calculation object carries the information about the quantities that are going to be calculated, i.e., the CPGF target functions. For this part, we still need to include more parameters related to the Chebyshev expansion (our examples already have optimized parameters for a standard desktop computer). All target functions require the following parameters:

  1. num_moments defines the number of moments of the Chebyshev expansion and hence the energy resolution of the calculation; see Resources.

  2. num_random defines the number of random vectors for the stochastic evaluation of target functions; see Resources.

  3. num_disorder defines the number of disorder realisations.
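The idea behind num_random is the stochastic evaluation of traces: averaging rᵀAr over random vectors r approximates Tr(A), with an error that shrinks as more vectors (and larger systems) are used. The following is a standalone numpy illustration of that principle, not KITE's implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 2000
H = rng.standard_normal((dim, dim))
H = (H + H.T) / 2                      # a random symmetric "Hamiltonian"

def stochastic_trace(A, num_random):
    """Estimate Tr(A) as the average of r^T A r over random vectors r
    with entries +/-1 (the idea behind the num_random parameter)."""
    estimates = []
    for _ in range(num_random):
        r = rng.choice([-1.0, 1.0], size=A.shape[0])
        estimates.append(r @ A @ r)
    return np.mean(estimates)

exact = np.trace(H)
approx = stochastic_trace(H, num_random=50)
print(abs(approx - exact) / dim)  # relative error shrinks as num_random grows
```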

The target functions currently available are:

  • dos
  • conductivity_optical
  • conductivity_dc
  • conductivity_optical_nonlinear
  • singleshot_conductivity_dc

with the following parameters:

  • direction – direction along which the conductivity is calculated (longitudinal: ‘xx’, ‘yy’; transversal: ‘xy’, ‘yx’)
  • temperature – temperature entering the Fermi-Dirac distribution used in the calculation of the optical and DC conductivities.
  • num_points – number of points in the energy axis that is going to be used by the post-processing tool to output the density of states.
  • special – simplified form of the nonlinear optical conductivity (see the hBN example).
  • energy – selected value of energy at which singleshot_conductivity_dc is calculated.
  • eta – imaginary term in the denominator of the Green function that provides a controlled broadening / inelastic energy scale (for technical details, see Resources).

The calculation is structured in the following way:

calculation = Calculation(configuration)

calculation.dos(num_points=1000, num_random=10, num_disorder=1, num_moments=512)

calculation.conductivity_optical(num_points=1000, num_random=1, num_disorder=1, num_moments=512, direction='xx')

calculation.conductivity_dc(num_points=1000, num_moments=256, num_random=1, num_disorder=1, direction='xy', temperature=1)

calculation.singleshot_conductivity_dc(energy=[(n/100.0 - 0.5)*2 for n in range(101)], num_moments=256, num_random=1, num_disorder=1, direction='xx', gamma=0.02)

calculation.conductivity_optical_nonlinear(num_points=1000, num_moments=256, num_random=1, num_disorder=1, direction='xxx', temperature=1.0, special=1)

Important note: The user can decide what functions are used in a calculation. However, it is not possible to configure the same function twice in the same Python script (HDF5 file).

When these objects are defined, we can export the file that will contain the set of input instructions for KITEx:

kite.export_lattice(lattice, configuration, calculation, 'test.h5')

Running the code

To run the code and post-process the output, use:

./KITEx test.h5
./tools/KITE-tools test.h5

Visualizing the data

After calculating the quantity of interest and post-processing the data, we can plot the resulting data with the following script:

image

If you want to make these steps more automatic, you can use the following Bash script:
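A minimal sketch of such a script; the Python export script name is an assumption for illustration, and each command is echoed (dry-run) so the sketch is safe to run as-is. Drop the echo prefixes to actually execute the steps:

```shell
#!/usr/bin/env bash
# Dry-run sketch of the KITE workflow: export the model, run KITEx,
# then post-process with KITE-tools. "export_model.py" is a hypothetical
# script that produces the .h5 file passed as the first argument.
run_kite() {
    local h5_file="$1"
    echo python export_model.py
    echo ./KITEx "$h5_file"
    echo ./tools/KITE-tools "$h5_file"
}

run_kite test.h5
```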