
I need to build the following JSON structure dynamically.

json = {
    "mainkey": "val1",
    "key2": [
        {"keya": "val1rec1", "keyb": "val2rec1", "keyc": "val3rec1"},
        {"keya": "val1rec2", "keyb": "val2rec2", "keyc": "val3rec2"},
        {"keya": "val1rec3", "keyb": "val2rec3", "keyc": "val3rec3"},
        {"keya": "val1rec4", "keyb": "val2rec4", "keyc": "val3rec4"},
        {"keya": "val1rec5", "keyb": "val2rec5", "keyc": "val3rec5"}
    ]
}

Only the `{"keya":"val1rec1","keyb":"val2rec1","keyc":"val3rec1"},` rows "iterate" - i.e., those rows are created/populated by reading values from a CSV file, one row per CSV record.
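No sample CSV is included in the question; for illustration only, a file shaped like this (hypothetical values, read with `header=None` so the columns are numbered 0, 1, 2) would map onto the five rows above, with column 0 feeding keya, column 1 keyb, and column 2 keyc:

```
val1rec1,val2rec1,val3rec1
val1rec2,val2rec2,val3rec2
val1rec3,val2rec3,val3rec3
val1rec4,val2rec4,val3rec4
val1rec5,val2rec5,val3rec5
```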

So my pseudo code looks something like this:

import pandas as pd

# create dict
path = r'somewhere\on\my\disk\file.csv'  # raw string so '\f' isn't read as a form feed
json_file = {}
json_file['mainkey'] = "val1"
# read from CSV file
df1 = pd.read_csv(path, header=None)
# iterate through csv
for row, s in df1.iterrows():
    number = df1.loc[row, 0]
    # I'm reading keyb and keyc values from CSV as well, but for brevity
    # my substitution below is not showing that....
    json_file['key2'] = "'keya':'" + str(number) + "','keyb':'whatever','keyc':'whatever'"
print(json_file)

It obviously fails to produce what I'm looking for above - hence my post here for assistance.
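For what it's worth, the failure can be reproduced without pandas: each pass through the loop assigns a hand-built string to `key2`, so the final dict holds a single string (from the last row) rather than a list of dicts. A minimal sketch with hypothetical stand-in values:

```python
# Minimal reproduction of the problem: every iteration overwrites 'key2'
# with a hand-built string, so no list of row dicts is ever accumulated.
json_file = {'mainkey': 'val1'}
for number in ['val1rec1', 'val1rec2']:
    json_file['key2'] = "'keya':'" + str(number) + "','keyb':'whatever','keyc':'whatever'"

print(type(json_file['key2']).__name__)  # str, and only the last row survives
```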

2 Comments
  • Could you provide the csv-file you're working with? At least a sample. Commented Aug 29, 2018 at 9:09
  • Looking further through your code, it seems you're trying to build the JSON object structure manually for key2 while using a proper dict structure before that. See my answer for further details. Commented Aug 29, 2018 at 9:21

2 Answers


It looks like you're trying to construct a JSON encoder manually; this is unnecessary, since a perfectly good JSON encoder is built into Python.

I'd recommend building up your dict using native data structures and then using the built-in json utilities. This produces cleaner, more maintainable code and is less error-prone.

Like this:

import json
import pandas as pd


path = r"somewhere\on\my\disk\file.csv"  # raw string so backslashes aren't treated as escapes
# Initialize dict
data = {"mainkey": "val1", "key2": []}

# Parse CSV file
df1 = pd.read_csv(path, header=None)
# Iterate through the CSV rows
for row, s in df1.iterrows():
    number = df1.loc[row, 0]

    # I'm reading keyb and keyc values from CSV as well,
    # but for brevity my substitution below is not showing that....
    data["key2"].append({
        "keya": number,
        "keyb": "whatever",
        "keyc": "whatever",
    })

# Print json to stdout/terminal
print(json.dumps(data, sort_keys=True))

# Save json to file (data.json)
with open("data.json", "w") as output:
    json.dump(data, output, sort_keys=True)

5 Comments

Thanks, I think I understand your suggestion - however, when I do this, the json is not in the correct order - the "mainkey" is printed as the last key in the file?
This shouldn't matter, since hashmaps/dicts are conceptually unordered. However, in Python 3.7+ dicts preserve insertion order.
Okay, so I did a little more digging for you: if you want it ordered you can pass sort_keys=True to json.dump(s). If you're using Python 3.6/3.7 that should be enough; if you're using an earlier version you'll have to use an OrderedDict. You can also control indentation with indent=X. See my updated answer for a sorted implementation.
If this doesn't work for you, please provide which version of Python you're using, otherwise I won't be able to help you further.
Thank you. Also figured out that the order doesn't matter, so your solution works fine.
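A minimal sketch of the ordering behaviour discussed in the comments above: `json.dumps` emits keys in dict insertion order by default, while `sort_keys=True` sorts them alphabetically (and `indent` controls pretty-printing).

```python
import json

# Insertion order vs. sorted order when serializing the same dict.
data = {"mainkey": "val1", "key2": [{"keya": "a"}]}

default_order = json.dumps(data)
sorted_order = json.dumps(data, sort_keys=True)

print(default_order)  # {"mainkey": "val1", "key2": [{"keya": "a"}]}
print(sorted_order)   # {"key2": [{"keya": "a"}], "mainkey": "val1"}
```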

You are overwriting the key2 value on every iteration, when you should be appending each row to a list:

json_file['key2'] = []
for row,s in df1.iterrows():
    number = df1.loc[row,0]
    json_file['key2'].append({'keya': str(number), 'keyb': 'whatever', 'keyc': 'whatever'})
print(json_file)
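To turn that dict into actual JSON text, it can then be passed through `json.dumps`; a sketch using hypothetical stand-in values in place of the CSV-driven loop:

```python
import json

# Hypothetical stand-in values replacing the pandas iterrows() loop above.
json_file = {'mainkey': 'val1', 'key2': []}
for number in ['val1rec1', 'val1rec2']:
    json_file['key2'].append({'keya': str(number), 'keyb': 'whatever', 'keyc': 'whatever'})

print(json.dumps(json_file))
```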

