CarpetX

The following summarizes the work done by S. Cupp on the CarpetX framework.

CarpetX Interpolator

  • Added the interface to connect Cactus' existing interpolation system to the CarpetX interpolator (see the sketch following this list)
  • The AHFinder interpolation test was extended to compare the results of calling CarpetX's interpolator directly with the results of calling the new interface.
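
As a concrete illustration, the following is a minimal sketch of how a thorn might call the generic flesh interpolation API that the new interface forwards to CarpetX. The variable, operator, and coordinate-system names ("ADMBase::alp", "CarpetX", "cart3d") are assumptions for illustration only, and the error handling is minimal.

  #include "cctk.h"
  #include "util_Table.h"

  /* Interpolate one grid function at two points through the generic
     flesh API; the new interface routes this call to CarpetX. */
  void interpolate_example(const cGH *cctkGH)
  {
    const int npoints = 2;
    const CCTK_REAL x[2] = {0.0, 1.0}, y[2] = {0.0, 1.0}, z[2] = {0.0, 1.0};
    const void *const coords[3] = {x, y, z};

    const CCTK_INT input_vars[1] = {CCTK_VarIndex("ADMBase::alp")};
    const CCTK_INT output_types[1] = {CCTK_VARIABLE_REAL};
    CCTK_REAL alp_at_points[2];
    void *outputs[1] = {alp_at_points};

    /* the interpolation order is passed through a parameter table */
    const int table = Util_TableCreateFromString("order=3");
    const int op_handle = CCTK_InterpHandle("CarpetX"); /* assumed operator name */
    const int cs_handle = CCTK_CoordSystemHandle("cart3d");

    const int ierr = CCTK_InterpGridArrays(
        cctkGH, 3, op_handle, table, cs_handle,
        npoints, CCTK_VARIABLE_REAL, coords,
        1, input_vars, 1, output_types, outputs);
    if (ierr < 0)
      CCTK_VWarn(CCTK_WARN_ALERT, __LINE__, __FILE__, CCTK_THORNSTRING,
                 "interpolation failed with error code %d", ierr);
  }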

CarpetX Arrays

  • Added support for distrib=const arrays in CarpetX (a usage sketch follows this list)
  • Combined the structures for arrays and scalars into a single struct
  • All scalar code was extended to handle both scalars and arrays
  • Added a test in the TestArray thorn to verify that array data is allocated correctly and behaves as expected when accessed
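
To make the array support concrete, here is a minimal sketch, in the classic Cactus C style, of a scheduled routine filling a distrib=const grid array. The thorn, routine, and variable names and the assumed SIZE=10 are hypothetical; they only illustrate the access pattern the TestArray test exercises.

  #include "cctk.h"
  #include "cctk_Arguments.h"
  #include "cctk_Parameters.h"

  /* Hypothetical scheduled routine: fills a 1D CCTK_REAL grid array assumed
     to be declared in interface.ccl with DISTRIB=const and SIZE=10. Since
     the array is distrib=const, every process holds a full copy, so the
     loop runs over all elements. */
  void TestArray_FillArray(CCTK_ARGUMENTS)
  {
    DECLARE_CCTK_ARGUMENTS;
    DECLARE_CCTK_PARAMETERS;

    for (int i = 0; i < 10; i++)
      test_array[i] = 2.0 * i; /* arbitrary test values */
  }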

CarpetX DynamicData

  • Overloaded the DynamicData function for use with CarpetX (see the sketch following this list)
  • Added the necessary storage to the array group data so that the dynamic data can be provided when requested
  • Added a test in the TestArray thorn to verify that the dynamic data returned for grid functions, scalars, and arrays is correct
  • This test also serves as a basic test of read/write declarations, since all three variable types are both written and read
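
For reference, the sketch below shows how a thorn might query the dynamic data of a grid-array group through the standard flesh call that the overloaded CarpetX function now services; the group name is hypothetical.

  #include <stdio.h>
  #include "cctk.h"

  /* Query the dynamic data (dimension, local/global extents, ...) of a
     hypothetical grid-array group via the standard flesh API. */
  void query_dynamic_data(const cGH *cctkGH)
  {
    const int group = CCTK_GroupIndex("TestArray::test_array");
    cGroupDynamicData data;

    if (group >= 0 && CCTK_GroupDynamicData(cctkGH, group, &data) == 0)
    {
      /* lsh/gsh hold the local and global extents in each dimension */
      printf("dim = %d, local size = %d, global size = %d\n",
             data.dim, (int)data.lsh[0], (int)data.gsh[0]);
    }
  }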

Multipole/qc0

  • Multipole was incorporated into the cactusamrex repository
  • The test produces data that matches the old code (be sure to use the same interpolator)
  • A bug in TwoPunctures was found and fixed
  • Work towards simulating qc0 is in progress

Open issues/bugs

  • Something seems to be wrong with the cell- and vertex-centered code in the interpolator. The Multipole code runs into this if the index tag is removed from the harmonic grid functions. Roland has been working on a fix, but as far as I know it is not in the main branch yet.
  • Storage being always on in CarpetX results in attempted regridding, etc., which causes validity failures. These occur because the expected number of time levels does not match the actual number of time levels.
  • If there are too few cells, an unclear error appears that boils down to the assertion 'bc=bcrec' failing. This happens because the cells are so large relative to the number of points that multiple physical boundaries fall within a single cell. Detecting this situation and emitting a clearer error message is recommended. Alternatively, the code could be changed to allow such runs to work, though that may not be worthwhile if it takes significant effort or time.
  • CarpetX seems to use substantial resources for the qc0 test, and we are unsure why. The memory usage is several times higher than Roland's estimate for the given grid size.

Open tasks/improvements

  • SymmetryInterpolate is not hooked up yet, but commented-out code provides a starting point
  • Currently, both CarpetX and Cactus have parameters for the interpolation order. For example, qc0.par has to set both "CarpetX::interpolation_order = 3" and "Multipole::interpolator_pars = "order=3"". These should be condensed into a single parameter. Since individual thorns set their own interpolation order, different orders can presumably be chosen by different thorns; if different variables request different interpolation orders, the current implementation would break. Instead of using its own parameter, CarpetX should work with the existing infrastructure.
  • CarpetX's interpolate function does not return error codes, but historically the interpolator has provided them. The new interpolate function should incorporate the old error codes to fully reproduce the functionality of the old interpolator.
  • The interpolation interface should report the error codes returned by TableGetIntArray().
  • Distributed arrays are still not supported. It is unclear (at least to me) where these are used. However, if they are needed, CarpetX will have to be extended to support them.
  • The DynamicData test revealed a bug in the scalar validity code. We should therefore consider whether we need more tests that specifically exercise the valid/invalid tracking for the various variable types, for example tests validating the poison routine, the NaN checker, etc.