JSON import of types and objects fails

Hello

We recently started evaluating DATAGERRY as a potential new CMDB for our company and encountered a major issue. The installation is as standard as it gets, following the official docs.

We have created a couple of new data types, and adding objects to them via the GUI works fine. Importing properly formatted CSV files also works as expected. But we can't do the same with JSON, which is a priority for us since we would like to feed DATAGERRY with output from Ansible/facter.
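For context, facter emits a flat JSON map of fact names to values; a trimmed illustration (not actual output from our hosts):

{
  "hostname": "defralinux123",
  "domain": "example.com"
}

The goal is to transform facts like these into DATAGERRY's JSON object import format, which the export further below shows.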

To reproduce the issue I will start from the beginning with the creation of the simplest possible data type:

I add a new section and one Text field:

I choose Hostname in META DATA:

No change in ACCESS:

VALIDATION and save:

New Data Type has been created:

To make sure that the JSON output will be proper, I will now simply add an Object to this Data Type:

Everything looks good, right?

Now let’s export this Object as JSON:

The output file looks like this:

[
  {
    "public_id": 230,
    "active": true,
    "type": "LinuxServer",
    "fields": [
      {
        "name": "hostname",
        "value": "defralinux123"
      }
    ]
  }
]

Now, I assume that I should be able to import it back at any time, even without any changes made to it:

Selecting LinuxServer as Data Type:

It doesn't matter whether I type 100 or 1000 in the Max element field:

After clicking on Import, DATAGERRY throws an error:

Although it runs with the debug option on, there is nothing interesting in the log files:

[2020-10-30 12:31:02][INFO ] - Parser <class 'cmdb.importer.parser_object.JsonObjectParser'> was loaded (importer_object_routes.py)
[2020-10-30 12:31:02][DEBUG ] - {'start_element': 0, 'max_elements': 0, 'overwrite_public': True, 'type_id': 10} (importer_object_routes.py)
[2020-10-30 12:31:02][INFO ] - Importer <class 'cmdb.importer.importer_object.JsonObjectImporter'> was loaded (importer_object_routes.py)

The result is exactly the same if I export and import JSON with a freshly created Data Type. The same procedure with CSV works perfectly fine for both Data Types and Objects.

We really like DATAGERRY, how simple it is and how smoothly it works, but without a working JSON import we can't continue any further testing.

Thank you.

Hi @sto,

welcome to our community, and thank you very much for your detailed bug report. We have opened an issue for it and our team is currently working on a bugfix. I'll keep you updated once the fix is finished.

Thank you, Michael, looking forward to it.

Hi @sto,

good news: we finished the bugfix. It will be included in our next feature release, 1.4 (which will be released next week, on Nov 13th). If you don't want to wait, just have a look at the version-1.3 branch. It includes version 1.3.3 plus the bugfix. You can download binaries here or use the tag branches_version-1.3 for your Docker image.

Hey Michael

Already downloaded and tested :slight_smile:

I've got good news and bad news; let's start with the good: importing an object that has just been exported (i.e. updating an existing one) works perfectly. Also, after manual modification of the JSON file (sketched below), the creation of a new object works as expected.
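For illustration only (a sketch of the kind of edit I mean, not my exact file): the exported JSON from above with a changed public_id and hostname, which the importer then treats as a new object:

[
  {
    "public_id": 231,
    "active": true,
    "type": "LinuxServer",
    "fields": [
      {
        "name": "hostname",
        "value": "defralinux124"
      }
    ]
  }
]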

More good news: the export-import of types also works, but only in a certain scenario: you export a type, do not modify the resulting JSON, and import it as it is, but only with the Update option. This scenario works. But if you select the Create option, an exception occurs. I kind of understand it: we are trying to create an exact type that already exists, but a message in the GUI would be nicer than an exception.

Another scenario I just tested was changing two things in the exported type JSON:

from:

  {
    "type": "text",
    "name": "domainname",
    "label": "domainname"
  },

to:

  {
    "type": "text",
    "name": "ip",
    "label": "ip"
  }

I did not change the public_id of the type and chose Update, which went fine, but immediately after clicking on the Dashboard I got an exception again, and now I can't work with DATAGERRY anymore because any URL throws an exception. Cleaning up Mongo and restarting is the only quick option (we're not running in production yet).

I can only assume why this happened: since I had already added two Objects with "domainname" keys and data, and then updated the Type, replacing "domainname" with "ip", DATAGERRY no longer knows how to parse and show these existing Objects. I hope that makes sense.
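To sketch the presumed mismatch (illustrative JSON, not an actual dump): the two existing Objects still store the old field name,

  {
    "fields": [
      {
        "name": "domainname",
        "value": "test.local"
      }
    ]
  }

while the updated Type now only defines the new one:

  {
    "fields": [
      {
        "type": "text",
        "name": "ip",
        "label": "ip"
      }
    ]
  }

so the renderer can't match the stored values to any field definition anymore.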

Nonetheless, thank you for the really quick fix; looking forward to the next one.

Sto

Hi @sto,

thanks for your fast feedback.

Let me give you some details about the import of types. Of course you can change the fields in the type definition, even if objects of that type already exist. If you change the name of an existing field by editing the JSON structure, please also change the field name in the "render_meta" section and, if it is a summary field, in the "summary" section of the JSON structure, like in the following example:
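(Sketched below from the field in sto's export; the section name and label are placeholders, and the exact structure may differ slightly between versions.)

{
  "fields": [
    {
      "type": "text",
      "name": "ip",
      "label": "ip"
    }
  ],
  "render_meta": {
    "sections": [
      {
        "type": "section",
        "name": "section-1",
        "label": "Information",
        "fields": [
          "ip"
        ]
      }
    ],
    "summary": {
      "fields": [
        "ip"
      ]
    }
  }
}

The renamed field has to appear consistently in all three places: the "fields" list, the section's "fields" list inside "render_meta", and the "summary" fields.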

In our upcoming 1.4 release, we also added some improvements to the error handling in the JSON importer. If you import a type definition with an existing public_id, 1.4 will show a new error message in the UI:

In the future (this may take a while, as it currently does not have a high priority), we want to improve the import of types to make sharing types between multiple instances more user-friendly. This is tracked in NET-333.

Hope that helps?

Hey @mbatz

Yes, it does, thank you for the explanation.

Any chance of getting the current 1.4 development version in binary format for further testing?

Sto

Hi @sto,

yes, there is. The binary for the current development version can be downloaded from files.datagerry.com, or you can just use the Docker tag branches_development. Please don't use it in production :slight_smile: