How to use export with Python on Linux

I need to make an export like this in Python:

# export MY_DATA="my_export"

I've tried:

# -*- python-mode -*-
# -*- coding: utf-8 -*-
import os
os.system('export MY_DATA="my_export"')

But when I list the exports, "MY_DATA" does not appear:

# export

How can I do an export with Python without saving "my_export" to a file?

Bookbinding answered 1/10, 2009 at 19:39 Comment(0)
116

export is a command that you give directly to the shell (e.g. bash), to tell it to add or modify one of its environment variables. You can't change your shell's environment from a child process (such as Python), it's just not possible.

Here's what's happening when you try os.system('export MY_DATA="my_export"')...

/bin/bash process, command `python yourscript.py` forks python subprocess
 |_
   /usr/bin/python process, command `os.system()` forks /bin/sh subprocess
    |_
      /bin/sh process, command `export ...` changes its local environment

When the bottom-most /bin/sh subprocess finishes running your export ... command, then it's discarded, along with the environment that you have just changed.

Spitler answered 1/10, 2009 at 20:11 Comment(8)
Indeed I do not see it like that!Bookbinding
I just realized, after a lot of tests, that you are right: I can't change my shell's environment from a child process (such as Python), it's just not possible.Bookbinding
@KevinCampion Please change the accepted answer in that case.Shingly
hm.. I tried running subprocess.check_output( 'export x=foo && other_people_command_depending_on_x' ) and it didn't work somehow -- any ideas what happens there? Setting os.environ['x'] = 'foo' for Python (and thus all its child processes) works.Sciomancy
I recently had to do something similar; here is the issue and what worked for me. The problem was to execute a Python script which internally executes an ELF binary, and I wanted a certain path to be set for this binary. The solution that worked for me was to fetch the current PATH variable from the Python code and then directly update it using os.putenv. Though this will not update the PATH variable of the shell from which the Python script was originally invoked.Papaverine
export MY_DATA=MY_EXPORT is what I was looking forCabman
the accepted answer is misleading, @KevinCampion needs to change the accepted answer based on multiple answers: answer1, answer2, answer3Schaefer
Done @RahulReddyBookbinding
100

You actually want to do

import os
os.environ["MY_DATA"] = "my_export"
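A quick sketch of that assignment's scope (assuming a POSIX sh is available): the variable reaches this Python process and anything it launches afterwards, though never the shell that started Python:

```python
import os
import subprocess

os.environ["MY_DATA"] = "my_export"

# Child processes started from here inherit the updated environment.
print(subprocess.check_output(["sh", "-c", "echo $MY_DATA"], text=True).strip())  # my_export
```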
Prut answered 1/10, 2009 at 19:46 Comment(1)
This doesn't actually work (although it's a nicer way to do this): $ python Python 2.7.10 (default, Sep 8 2015, 17:20:17) [GCC 5.1.1 20150618 (Red Hat 5.1.1-4)] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>> import os >>> os.environ["MY_DATA"] = "my_export" >>> $ export | grep -c MY_DATA 0Scrubber
22

Another way to do this, if you're in a hurry and don't mind the hacky aftertaste, is to have the Python script print the commands that set the environment, and then execute that output in your bash environment. Not ideal, but it can get the job done in a pinch. It's not very portable across shells, so YMMV.

$(python -c 'print("export MY_DATA=my_export")')

(you can also enclose the statement in backticks in some shells ``)
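A minimal sketch of the same trick with eval, assuming python3 and a bash-like shell; eval keeps it working even when the script prints several commands:

```shell
# The shell substitutes python's stdout and runs it as a command in the
# CURRENT shell, so the export really lands here.
eval "$(python3 -c 'print("export MY_DATA=my_export")')"
echo "$MY_DATA"   # my_export
```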

Benisch answered 28/1, 2014 at 17:35 Comment(3)
Can others comment as to why this got downvoted? It seems like a reasonable solution given the desired requirements. It doesn't start a new subshell, and does actually add new environment variables to the current, running shell process.Latimore
Actually quite cool. Better than writing a script and sourcing it later on.Ailin
Indeed, quite cool. This and the more detailed version of @Akhil should be the best answer.Gariepy
10

Not that simple:

python -c "import os; os.putenv('MY_DATA','1233')"
$ echo $MY_DATA # <- empty

But:

python -c "import os; os.putenv('MY_DATA','123'); os.system('bash')"
$ echo $MY_DATA #<- 123
Hydrolysate answered 1/10, 2009 at 19:58 Comment(5)
just a reminder: if you run the second line many times, the same number of nested bash children will be created.Utas
Basically, you just created a new bash instance on top of python which is on top of another bashCountermand
This solution is not correct. In a python script with many commands, the script will exit as the new bash instance is created.Remainderman
Don't do that, creating an entire new bash process just for environment variable is really bad practice.Topaz
That will take you to another shell inside the terminal, so if you put that inside a script that has several other commands after it, they will be stuck until you return/exit from that new shell.Pursley
2

I have an excellent answer.

#! /bin/bash

output=$(git diff origin/master..origin/develop | \
python -c '
# DO YOUR HACKING
variable1_to_be_exported="Yo Yo"
variable2_to_be_exported="Honey Singh"
# ... and so on
magic=""
magic+="export onShell_var1=\""+str(variable1_to_be_exported)+"\"\n"
magic+="export onShell_var2=\""+str(variable2_to_be_exported)+"\""
print(magic)
'
)

eval "$output"
echo "$onShell_var1"   # Output will be Yo Yo
echo "$onShell_var2"   # Output will be Honey Singh

Mr Alex Tingle is correct about the process and sub-process stuff.

It can be achieved as I have shown above. The key concept is:

  1. Whatever is printed from Python is captured into the bash variable [output]
  2. We can execute any command in the form of a string using eval
  3. So, format your Python print output as meaningful bash commands
  4. Use eval to execute it in bash

And you can see your results

NOTE: Always quote the eval argument with double quotes, or else bash will mangle your \n's and the output will be strange

PS: I don't like bash but you have to use it

Twinkling answered 18/4, 2019 at 11:37 Comment(1)
It is indeed an excellent answer!! See also the answer by @mikepk, same ideaGariepy
2

I've had to do something similar on a CI system recently. My options were to do it entirely in bash (yikes) or use a language like python which would have made programming the logic much simpler.

My workaround was to do the programming in python and write the results to a file. Then use bash to export the results.

For example:

# do calculations in python
with open("./my_export", "w") as f:
    f.write(your_results)
# then in bash
export MY_DATA="$(cat ./my_export)"
rm ./my_export  # if no longer needed
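End to end, the hand-off can be sketched like this (the file name ./my_export and the computed value are illustrative):

```shell
# Python does the computation and writes the result to a scratch file.
python3 -c 'open("./my_export", "w").write("my_export")'

# The calling shell turns the file contents into an exported variable.
export MY_DATA="$(cat ./my_export)"
rm ./my_export
echo "$MY_DATA"   # my_export
```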
Faltboat answered 6/11, 2020 at 16:24 Comment(0)
0

You could try os.environ["MY_DATA"] = "my_export" instead.

Bangup answered 1/10, 2009 at 19:41 Comment(1)
This doesn't answer the question at all, because this doesn't actually export to the current shell.Poikilothermic
0

Kind of a hack because it's not really python doing anything special here, but if you run the export command in the same sub-shell, you will probably get the result you want.

import os

cmd = "export MY_DATA='1234'; echo $MY_DATA" # or whatever command
os.system(cmd)
Zulema answered 12/12, 2017 at 18:17 Comment(0)
0

In the hope of providing clarity over common confusion...

I have written many python <--> bash <--> elfbin toolchains, and the proper way to see it is this:

Each process (originator) has a state of the environment inherited from whatever invoked it. Any change remains local to that process. Transferring an environment state is a function by itself and runs in two directions, each with its own caveats. The most common thing is to modify the environment before running a sub-process. To go down to the metal, look at the exec() call in C. There is a variant that takes a pointer to environment data. This is the only actually supported transfer of environment in typical OSes.

When you do an export, a shell script adds the variable to the state it passes to the children it runs. Otherwise it just passes on what it got in the first place.

In all other cases it will be some generic mechanism used to pass a set of data, to allow the calling process itself to update its environment based on the child process's output.

Ex:

ENVUPDATE=$(CMD_THAT_OUTPUTS_KEYVAL_LISTS)
echo "$ENVUPDATE" > "$TMPFILE"
source "$TMPFILE"

The same can of course be done using json, xml or other things as long as you have the tools to interpret and apply.
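A sketch of the JSON variant (the child command here is a stand-in for any program that emits its environment updates as a JSON object):

```python
import json
import os
import subprocess
import sys

# Stand-in child: any program that prints its env updates as JSON.
child = subprocess.run(
    [sys.executable, "-c",
     'import json; print(json.dumps({"MY_DATA": "my_export"}))'],
    capture_output=True, text=True, check=True)

# The calling process applies the updates to its own environment.
os.environ.update(json.loads(child.stdout))
print(os.environ["MY_DATA"])  # my_export
```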

The need for this may be (50% chance) a sign of misconstruing the basic primitives, and that you need a better config or parameter interchange in your solution...

Oh, in python I would do something like... (needs improvement depending on your situation)

import os
import re

RE_KV = re.compile(r'([a-z]\w*)\s*=\s*(.*)')

OUTPUT = RunSomething(...)  # assuming output like 'k1=v1 k2=v2'

for kv in OUTPUT.split(' '):
    try:
        k, v = RE_KV.match(kv).groups()
        os.environ[k] = str(v)
    except AttributeError:
        # not a key=value pair
        pass
Generation answered 18/12, 2017 at 20:45 Comment(0)
0

One line solution:

eval `python -c 'import sysconfig;print("python_include_path={0}".format(sysconfig.get_path("include")))'`
echo $python_include_path  # prints /home/<usr>/anaconda3/include/python3.6m in my case

Breakdown:

Python call

python -c 'import sysconfig;print("python_include_path={0}".format(sysconfig.get_path("include")))'

It's launching a python script that

  1. imports sysconfig
  2. gets the python include path corresponding to this python binary (use "which python" to see which one is being used)
  3. prints the string "python_include_path={0}" with {0} being the path from step 2

Eval call

eval `python -c 'import sysconfig;print("python_include_path={0}".format(sysconfig.get_path("include")))'`

It executes the output from the python script in the current bash instance. In my case, it's executing:

python_include_path=/home/<usr>/anaconda3/include/python3.6m

In other words, it's setting the environment variable "python_include_path" with that path for this shell instance.

Inspired by: http://blog.tintoy.io/2017/06/exporting-environment-variables-from-python-to-bash/

Odelle answered 22/2, 2018 at 17:0 Comment(0)
0
import os
import shlex
from subprocess import Popen, PIPE


# Put the variable into this process's environment; the child inherits it.
os.environ["MY_DATA"] = "my_export"

res = Popen(shlex.split("cmd xxx -xxx"), stdin=PIPE, stdout=PIPE, stderr=PIPE,
            env=os.environ).communicate('y\ny\ny\n'.encode('utf8'))
stdout, stderr = res

Jerk answered 24/6, 2019 at 10:38 Comment(1)
Welcome to SO, and thank you for your contribution. Please add some explanation along with the code; it will help SO members understand your answer better.Np
0

If the calling script is python then using subprocess.run is more appropriate. You can pass a modified environment dictionary to the env parameter of subprocess.run.

Here's a step-by-step guide:

1] Import the Subprocess Module: Make sure you have the subprocess module imported in your Python script.

import subprocess
import os

2] Prepare the Environment Variables: Create or modify the environment variables as needed. You can start with a copy of the current environment and then update it with your specific variables.

# Copy the current environment
env = os.environ.copy()

# Set your custom environment variables
env["MY_VARIABLE"] = "value"
env["ANOTHER_VARIABLE"] = "another value"

3] Call subprocess.run with the Custom Environment: Use the env parameter to pass your custom environment to the subprocess.

# Call the subprocess with the custom environment
result = subprocess.run(["your_script.sh"], env=env)

Replace "your_script.sh" with the path to your script or command.

4] Optional: Handle the Result: You can handle the result of the subprocess call as needed, for example, checking if the script ran successfully.

if result.returncode == 0:
    print("Script executed successfully")
else:
    print("Script failed with return code", result.returncode)

Hunnicutt answered 24/11, 2023 at 11:16 Comment(0)
-5

os.system ('/home/user1/exportPath.ksh')

exportPath.ksh:

export MY_DATA="my_export"

Perk answered 20/1, 2012 at 7:33 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.