
Can somebody suggest a better way to execute multiple commands stored in JSON from Python?

For example, how could I improve the following:

JSON:

$ python -m json.tool test2.json

{
"command": "mkdir -p /home/ec2-user/data2/opt/ODS",
"command1": "mkdir -p /home/ec2-user/data4/opt/ODS",
"command2": "mkdir -p /home/ec2-user/data5/opt/ODS"
}

Python:

$ cat parse.py
   import json
   import sys
   import subprocess
   from pprint import pprint

   jdata = open(sys.argv[1])

   data = json.load(jdata)

   print "start"
   print(data)
   subprocess.call(data['command'], shell=True)
   subprocess.call(data['command1'], shell=True)
   subprocess.call(data['command2'], shell=True)
   print "end"
   jdata.close()

Output:

$ python parse.py test2.json
start
{u'command1': u'mkdir -p /home/ec2-user/data4/opt/ODS', u'command2': u'mkdir -p /home/ec2-user/data5/opt/ODS', u'command': u'mkdir -p /home/ec2-user/data2/opt/ODS'}
end

TIA

  • Hi daimne, try using a JSON structure that is an array (or list), not an unordered hash. Then you can just iterate through the list in order and not pay attention to the keys (command, command1, command2 are superfluous). Commented Sep 18, 2017 at 18:17
  • I would suggest not doing it at all. If you are going to let your Python script execute arbitrary shell commands, just replace test2.json with a shell script and have Python execute it directly. Commented Sep 18, 2017 at 18:31
  • @Chepner: You could also accomplish the same result with a short shell script and no Python code at all. Commented Sep 19, 2017 at 3:42
  • Yes, but I can picture a reason where the shell script is just a small part of what the Python script is doing. I see no reason to package individual lines of a shell script into a JSON file, only to be unpackaged again for execution. Commented Sep 19, 2017 at 12:26
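
The alternative suggested in the comments can be sketched like this (Python 3; the script name and the temporary directory standing in for the question's paths are illustrative):

```python
import os
import subprocess
import tempfile

# Keep the commands in a plain shell script instead of packing them
# into JSON, then run the script with a single subprocess call.
root = tempfile.mkdtemp()
script = os.path.join(root, "setup.sh")  # hypothetical script name
with open(script, "w") as f:
    f.write("mkdir -p %s/data2/opt/ODS\n" % root)
    f.write("mkdir -p %s/data4/opt/ODS\n" % root)

subprocess.call(["bash", script])
print(os.path.isdir(os.path.join(root, "data2", "opt", "ODS")))  # → True
```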

3 Answers


Here's the structure and code you are looking for:

$ python -m json.tool test2.json
[
    "mkdir -p /home/ec2-user/data2/opt/ODS",
    "mkdir -p /home/ec2-user/data4/opt/ODS",
    "mkdir -p /home/ec2-user/data5/opt/ODS"
]

Python:

$ cat parse.py 
import json
import sys
import subprocess
from pprint import pprint

jdata = open(sys.argv[1])

data = json.load(jdata)

print "start"
print(data)
for command in data:
  subprocess.call(command, shell=True)
print "end"
jdata.close()

Output:

$ python parse.py test2.json
start
[u'mkdir -p /home/ec2-user/data2/opt/ODS', u'mkdir -p /home/ec2-user/data4/opt/ODS', u'mkdir -p /home/ec2-user/data5/opt/ODS']
end
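
As a side note, `shell=True` can be avoided here: splitting each command string with `shlex.split` lets `subprocess.call` run it without a shell. A Python 3 sketch, using a temporary directory in place of the question's `/home/ec2-user` paths:

```python
import os
import shlex
import subprocess
import tempfile

# Stand-in for test2.json: the same array structure, rooted in a temp dir
root = tempfile.mkdtemp()
commands = ["mkdir -p %s/data2/opt/ODS" % root,
            "mkdir -p %s/data4/opt/ODS" % root]

for command in commands:
    # shlex.split turns "mkdir -p /path" into ["mkdir", "-p", "/path"],
    # so the string is never re-interpreted by a shell
    subprocess.call(shlex.split(command))

print(os.path.isdir(os.path.join(root, "data2", "opt", "ODS")))  # → True
```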

In your example it would be better to use os.makedirs.

But if you really need the subprocess module, I suggest executing everything with a single subprocess call. For example:

subprocess.call('; '.join(data.values()), shell=True)
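
A sketch of the os.makedirs route, which needs no subprocess at all (Python 3, with a temporary directory standing in for the question's /home/ec2-user):

```python
import os
import tempfile

root = tempfile.mkdtemp()  # stand-in for /home/ec2-user
for sub in ("data2", "data4", "data5"):
    # exist_ok=True makes makedirs behave like `mkdir -p`
    os.makedirs(os.path.join(root, sub, "opt", "ODS"), exist_ok=True)

print(os.path.isdir(os.path.join(root, "data5", "opt", "ODS")))  # → True
```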


Similar to @Mark's answer, but keeping your original structure:

$ cat parse.py
import json
import sys
import subprocess
from pprint import pprint

jdata = open(sys.argv[1])

data = json.load(jdata)

print "start"
print(data)
for command in data.values():                
  subprocess.call(command, shell=True)
print "end"
jdata.close()

3 Comments

Why do you iterate over keys and values if you need just the values? Dictionaries have a .values() method for that.
Is the order of value retrieval from the dictionary guaranteed? I think that the commands may not be executed in the order they were intended in the code above.
As long as the dict is not modified, yes. If there is still concern about order, use .iteritems().
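
For what it's worth, if the key-based structure has to stay and order matters, the standard-library json module accepts an object_pairs_hook that preserves key order (sample data inlined here for illustration):

```python
import json
from collections import OrderedDict

raw = ('{"command": "mkdir -p /a",'
       ' "command1": "mkdir -p /b",'
       ' "command2": "mkdir -p /c"}')
data = json.loads(raw, object_pairs_hook=OrderedDict)

# Keys come back in document order, so the commands run in the
# order they were written in the file
print(list(data.keys()))  # → ['command', 'command1', 'command2']
```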
