How to import a Python class that is in a directory above?

Question:


I want to inherit from a class in a file that lies in a directory above the current one.

Is it possible to relatively import that file?

Answer #1:

from ..subpkg2 import mod

Per the Python docs: inside a package hierarchy, use two leading dots to go up one level. As the import statement documentation says:

When specifying what module to import you do not have to specify the absolute name of the module. When a module or package is contained within another package it is possible to make a relative import within the same top package without having to mention the package name. By using leading dots in the specified module or package after from you can specify how high to traverse up the current package hierarchy without specifying exact names. One leading dot means the current package where the module making the import exists. Two dots means up one package level. Three dots is up two levels, etc.

So if you execute from . import mod from a module in the pkg package then you will end up importing pkg.mod. If you execute from ..subpkg2 import mod from within pkg.subpkg1 you will import pkg.subpkg2.mod. The specification for relative imports is contained within PEP 328.

PEP 328 deals with absolute/relative imports.
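To see these rules in action, here is a runnable sketch (the pkg/subpkg1/subpkg2 names are taken from the quoted docs; the module contents are illustrative) that builds the hierarchy in a temporary directory and exercises the two-dot import:

```python
import os
import sys
import tempfile

# Build the docs' example layout on disk; mod2.py performs the
# relative import `from ..subpkg2 import mod`.
root = tempfile.mkdtemp()
layout = {
    "pkg/__init__.py": "",
    "pkg/subpkg1/__init__.py": "",
    "pkg/subpkg1/mod2.py": "from ..subpkg2 import mod\n",
    "pkg/subpkg2/__init__.py": "",
    "pkg/subpkg2/mod.py": "ANSWER = 42\n",
}
for rel, body in layout.items():
    path = os.path.join(root, rel)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w") as fh:
        fh.write(body)

sys.path.insert(0, root)
import pkg.subpkg1.mod2  # triggers the relative import inside mod2.py

print(pkg.subpkg1.mod2.mod.ANSWER)  # → 42
```

Because the relative import is resolved against pkg.subpkg1's position in the package hierarchy, not against any particular directory, this works no matter where `root` lives on disk.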

Answered By: gimel

Answer #2:

import sys
sys.path.append("..") # Adds the parent directory (relative to the current working directory) to the module search path.
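Note that `".."` is resolved against the current working directory, not the script's location, so this breaks if the script is launched from another directory. A slightly more robust sketch anchors on the script's own path via `pathlib`:

```python
import sys
from pathlib import Path

# Resolve this script's parent-of-parent directory, independent of
# the current working directory, and add it to the module search path.
parent_dir = Path(__file__).resolve().parent.parent
sys.path.append(str(parent_dir))
```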
Answered By: Sepero

Answer #3:

@gimel’s answer is correct if you can guarantee the package hierarchy he mentions. If you can’t (if your real need is exactly as you expressed it: tied exclusively to directories, with no necessary relationship to packaging) then you need to work on __file__ to find the parent directory (a couple of os.path.dirname calls will do;-), then (if that directory is not already on sys.path) temporarily insert it at the very start of sys.path, __import__, and remove it again. Messy work indeed, but, “when you must, you must” (and Python strives never to stop the programmer from doing what must be done, just like the ISO C standard says in the “Spirit of C” section of its preface!-).

Here is an example that may work for you:

import sys
import os.path

sys.path.append(
    os.path.abspath(os.path.join(os.path.dirname(__file__), os.path.pardir)))

import module_in_parent_dir
Answered By: Alex Martelli

Answer #4:

Import a module from the directory exactly one level above the current one (note this relative form only works from inside a package):

from .. import module
Answered By: Sepero

Answer #5:

How to load a module that is a directory up

preface: I did a substantial rewrite of a previous answer with the hope of helping ease people into python’s ecosystem, and hopefully give everyone the best chance of success with python’s import system.

This will cover relative imports within a package, which I think is the most probable case for OP’s question.

Python is a modular system

This is why we write import foo to load a module “foo” from the root namespace, instead of writing:

foo = dict()  # please avoid doing this
with open(os.path.join(os.path.dirname(__file__), '../foo.py')) as foo_fh:  # please avoid doing this
    exec(compile(foo_fh.read(), 'foo.py', 'exec'), foo)  # please avoid doing this

Python isn’t coupled to a file-system

This is why we can embed python in environments where there isn’t a de facto filesystem (without providing a virtual one), such as Jython.

Being decoupled from a filesystem makes imports flexible; this design allows for things like imports from archive/zip files, import singletons, bytecode caching, cffi extensions, even remote code definition loading.

So if imports are not coupled to a filesystem, what does “one directory up” mean? We have to pick some heuristics, but we can do that. For example, when working within a package, some heuristics have already been defined that make relative imports like .foo and ..foo work within the same package. Cool!

If you sincerely want to couple your source code loading patterns to a filesystem, you can do that. You’ll have to choose your own heuristics and use some kind of importing machinery; I recommend importlib.

Python’s importlib example looks something like so:

import importlib.util
import os.path
import sys

# For illustrative purposes.
file_path = os.path.join(os.path.dirname(__file__), '../foo.py')
module_name = 'foo'

foo_spec = importlib.util.spec_from_file_location(module_name, file_path)
# foo_spec is a ModuleSpec specifying a SourceFileLoader
foo_module = importlib.util.module_from_spec(foo_spec)
sys.modules[module_name] = foo_module
foo_spec.loader.exec_module(foo_module)  # actually execute the module's code

foo = sys.modules[module_name]
# foo is the sys.modules['foo'] singleton
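Here is a self-contained variant of the same recipe (the temp-directory setup and module body are purely illustrative): it writes a throwaway foo.py and loads it by file path, exactly as one would load a file from a parent directory:

```python
import importlib.util
import os
import sys
import tempfile

# Create a stand-in for the "../foo.py" we want to load.
parent_dir = tempfile.mkdtemp()
module_path = os.path.join(parent_dir, "foo.py")
with open(module_path, "w") as fh:
    fh.write("x = 41 + 1\n")

spec = importlib.util.spec_from_file_location("foo", module_path)
foo = importlib.util.module_from_spec(spec)
sys.modules["foo"] = foo
spec.loader.exec_module(foo)  # run foo.py's code inside the new module

print(foo.x)  # → 42
```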


There is a great example project available officially: the PyPA sample project (pypa/sampleproject).

A python package is a collection of information about your source code that can inform other tools how to copy your source code to other computers, and how to integrate your source code into that system’s path so that import foo works on other computers (regardless of interpreter, host operating system, etc).

Directory Structure

Let’s have a package named foo, in some directory (preferably an empty directory):

some_directory/
    foo.py  # `if __name__ == "__main__":`  lives here

My preference is to create setup.py as a sibling to foo.py, because it makes writing the setup.py file simpler; however, you can write configuration to change/redirect everything setuptools does by default if you like. For example, putting foo.py under a “src/” directory is somewhat popular, but that is not covered here.



#!/usr/bin/env python3
# setup.py -- a minimal sketch; real projects usually add more metadata

import setuptools

setuptools.setup(
    name="foo",
    py_modules=["foo"],
)



python3 -m pip install --editable ./  # or path/to/some_directory/

“editable” aka -e will yet-again redirect the importing machinery to load the source files in this directory, instead of copying the current exact files to the installing-environment’s library. This can also cause behavioral differences on a developer’s machine; be sure to test your code!
There are tools other than pip, however I’d recommend pip be the introductory one 🙂

I also like to make foo a “package” (a directory containing __init__.py) instead of a module (a single “.py” file). Both “packages” and “modules” can be loaded into the root namespace, but packages allow for nested namespaces, which is helpful if we want a “relative one directory up” import.



#!/usr/bin/env python3
# setup.py -- updated sketch: foo is now a package, so list it in `packages`

import setuptools

setuptools.setup(
    name="foo",
    packages=["foo"],
)


I also like to make a foo/__main__.py; this allows python to execute the package as a module, eg python3 -m foo will execute foo/__main__.py as __main__.

some_directory/
    setup.py
    foo/
        __init__.py
        __main__.py  # `if __name__ == "__main__":`  lives here, `def main():` too!


#!/usr/bin/env python3
# setup.py -- sketch: wire foo/__main__.py's `main()` up as a console script

import setuptools

setuptools.setup(
    name="foo",
    packages=["foo"],
    entry_points={
        'console_scripts': [
            # "foo" will be added to the installing-environment's text mode shell, eg `bash -c foo`
            'foo=foo.__main__:main',
        ],
    },
)

Let’s flesh this out with some more modules. Basically, you can have a directory structure like so:

some_directory/
    setup.py
    bar.py           # `import bar`
    foo/
        __init__.py  # `import foo`
        baz.py       # `import foo.baz`
        spam/
            __init__.py  # `import foo.spam`
            eggs.py      # `import foo.spam.eggs`

setup.py conventionally holds metadata information about the source code within, such as:

  • what dependencies are needed to install, named “install_requires”
  • what name should be used for package management (install/uninstall “name”); I suggest this match your primary python package name, in our case foo, though substituting underscores for hyphens is popular
  • licensing information
  • maturity tags (alpha/beta/etc)
  • audience tags (for developers, for machine learning, etc)
  • single-page documentation content (like a README)
  • shell names (names you type at a user shell like bash, or names you find in a graphical user shell like a start menu)
  • a list of python modules this package will install (and uninstall)
  • a de facto “run tests” entry point: python ./setup.py test
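As a sketch, the bullet points above map onto setuptools.setup() keyword arguments roughly like this (all values are illustrative; in a real setup.py you would pass them to setuptools.setup(**metadata)):

```python
# Illustrative metadata for the bullet list above; in a real setup.py
# these keys become keyword arguments to setuptools.setup(**metadata).
metadata = dict(
    name="foo",                        # package-management name
    install_requires=[],               # dependencies needed to install
    license="MIT",                     # licensing information
    classifiers=[
        "Development Status :: 3 - Alpha",  # maturity tag
        "Intended Audience :: Developers",  # audience tag
    ],
    long_description="README-style single-page documentation",
    entry_points={
        "console_scripts": ["foo=foo.__main__:main"],  # shell name
    },
    packages=["foo"],                  # python packages to install/uninstall
)
```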

It’s very expansive; it can even compile C extensions on the fly if a source module is being installed on a development machine. For an every-day example I recommend the PyPA Sample Repository’s setup.py.

If you are releasing a build artifact, eg a copy of the code that is meant to run on nearly identical computers, a requirements.txt file is a popular way to snapshot exact dependency information, where “install_requires” is a good way to capture minimum and maximum compatible versions. However, given that the target machines are nearly identical anyway, I highly recommend creating a tarball of an entire python prefix. This can be tricky, and too detailed to get into here. Check out pip install‘s --target option, or virtualenv aka venv, for leads.

back to the example

how to import a file one directory up:

From foo/spam/eggs.py, if we wanted code from foo/baz.py, we could ask for it by its absolute namespace:

import foo.baz

If we wanted to reserve the capability to move this code into some other directory in the future, with some other relative baz implementation, we could use a relative import like:

from .. import baz

(Note that import ..baz is a syntax error; relative imports must use the from form.)
Answered By: ThorSummoner

Answer #6:

Here’s a three-step, somewhat minimalist version of ThorSummoner’s answer for the sake of clarity. It doesn’t quite do what I want (I’ll explain at the bottom), but it works okay.

Step 1: Make a directory and add a setup.py file to it

In setup.py, write:

import setuptools

setuptools.setup(name='project_name')

Step 2: Install this directory as a package

Run this code in console:

python -m pip install --editable filepath_to/project_name

Instead of python, you may need to use python3 or something, depending on how your python is installed. Also, you can use -e instead of --editable.

Now, your directory will look more or less like this. I don’t know what the egg stuff is.

project_name/
    setup.py
    project_name.egg-info/
        ...
This folder is considered a python package and you can import from files in this parent directory even if you’re writing a script anywhere else on your computer.

Step 3. Import from above

Let’s say you make two files, one in your project’s main directory and another in a sub directory (here “sub_file.py” stands for whatever you named the second file). It’ll look like this:

    top_level_file.py
    subdirectory/
        sub_file.py
    test_3.egg-info/  |----- Ignore these guys
        ...           |

Now, if top_level_file.py looks like this:

x = 1

Then I can import it from the file in the subdirectory, or really from any other file anywhere else on your computer.

# In the file in the subdirectory, OR in any other file on your computer:

import random # This is a standard package that can be imported anywhere.
import top_level_file # Now, top_level_file works similarly.


This is different than what I was looking for: I hoped python had a one-line way to import from a file above. Instead, I have to treat the script like a module, do a bunch of boilerplate, and install it globally for the entire python installation to have access to it. It’s overkill. If anyone has a simpler method that doesn’t involve the above process or importlib shenanigans, please let me know.

Answered By: JDG

Answer #7:

Python is a modular system

Python doesn’t rely on a file system

To load python code reliably, have that code in a module, and that module installed in python’s library.

Installed modules can always be loaded from the top level namespace with import <name>

There is a great sample project available officially: the PyPA sample project (pypa/sampleproject).

Basically, you can have a directory structure like so:

the_foo_project/
    setup.py
    bar.py           # `import bar`
    foo/
        __init__.py  # `import foo`
        baz.py       # `import foo.baz`
        faz/
            __init__.py  # `import foo.faz`
            daz.py       # `import foo.faz.daz` ... etc.

Be sure to declare your setuptools.setup() in setup.py,

official example: the PyPA sampleproject’s setup.py

In our case we probably want to export bar.py and the foo/ package; my brief example:

#!/usr/bin/env python3
# setup.py -- brief sketch

import setuptools

setuptools.setup(
    name="the_foo_project",
    py_modules=["bar"],
    packages=["foo"],
    # Note, any changes to your setup.py, like adding to `packages`, or
    # changing `entry_points`, will require the module to be reinstalled;
    # `python3 -m pip install --upgrade --editable ./the_foo_project`
)


Now we can install our module into the python library;
with pip, you can install the_foo_project into your python library in edit mode,
so we can work on it in real time

python3 -m pip install --editable=./the_foo_project

# if you get a permission error, you can always use 
# `pip ... --user` to install in your user python library


Now from any python context, we can load our shared py_modules and packages

#!/usr/bin/env python3

import bar
import foo

Answered By: ThorSummoner
