Any directory with __init__.py
is considered a package in Python.
Any Python file inside a package is considered a module.
Modules contain functions and other bindings that are always exported.
If you are outside the package and want to import a module from it:
from package import module
module.use_function()
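As a runnable sketch of that layout (the package, module, and function names here are all hypothetical), built on disk with an empty __init__.py:

```python
import os
import sys
import tempfile

# Build a throwaway package: pkg/__init__.py (empty) and pkg/greet.py.
base = tempfile.mkdtemp()
pkg_dir = os.path.join(base, "pkg")
os.makedirs(pkg_dir)
open(os.path.join(pkg_dir, "__init__.py"), "w").close()  # empty __init__.py
with open(os.path.join(pkg_dir, "greet.py"), "w") as f:
    f.write("def hello():\n    return 'hello'\n")

sys.path.insert(0, base)   # make the package importable
from pkg import greet      # import the module from the package
print(greet.hello())       # -> hello
```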
However, that is only because the __init__.py
is empty. We can make the
package export functions and modules as well, which means this should be
possible:
import package
package.module.use_function()
package.use_function()
To do this, the package's __init__.py
must contain something like:
from . import module
from .mod import *
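A minimal end-to-end sketch of this re-exporting __init__.py (the names pkg2, mod, and use_function are made up for illustration):

```python
import os
import sys
import tempfile

# Build pkg2/ with a submodule and an __init__.py that re-exports it.
base = tempfile.mkdtemp()
pkg = os.path.join(base, "pkg2")
os.makedirs(pkg)
with open(os.path.join(pkg, "mod.py"), "w") as f:
    f.write("def use_function():\n    return 42\n")
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("from . import mod\nfrom .mod import *\n")

sys.path.insert(0, base)
import pkg2
print(pkg2.mod.use_function())  # qualified access through the module
print(pkg2.use_function())      # re-exported directly on the package
```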
Therefore the __init__.py
acts like an exporting script for the package, similar to how I use index.js
in my JavaScript packages. This means you can
use the package's __init__.py
as a sort of staging point for all exports, and
anything that isn't exported there is hidden, unless the user really wants to
acquire it, in which case they can import the module explicitly.
To actually get encapsulation at the module level, you need to use a _
prefix
on your bindings. This is different from the __
prefix used in classes. Note that
a *
import will skip module bindings that have a _
prefix, but
they can still be accessed explicitly if the module itself is imported.
See: https://stackoverflow.com/a/1547160/582917
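A quick demonstration of that behavior (the module name privmod and its bindings are hypothetical; exec is used only to run a module-level * import):

```python
import os
import sys
import tempfile

# A throwaway module with one public and one _-prefixed binding.
base = tempfile.mkdtemp()
with open(os.path.join(base, "privmod.py"), "w") as f:
    f.write("public = 1\n_private = 2\n")

sys.path.insert(0, base)
ns = {}
exec("from privmod import *", ns)  # simulate a module-level * import
print("public" in ns)              # True: public names come through
print("_private" in ns)            # False: _ names are skipped
import privmod
print(privmod._private)            # explicit access still works -> 2
```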
I think this form of using __init__.py
is the best approach. Users shouldn't need
to use the from ... import
syntax unless they need to import specific bindings
and don't want to use qualified module names.
I want to add a package with some files to the path and then use import module. How can I do that?
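One way, sketched with a throwaway package (the names mypkg, module, and f are hypothetical): put the directory that CONTAINS the package on sys.path, then import normally.

```python
import importlib
import os
import sys
import tempfile

# Build mypkg/__init__.py and mypkg/module.py on disk.
base = tempfile.mkdtemp()
pkg = os.path.join(base, "mypkg")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "module.py"), "w") as f:
    f.write("def f():\n    return 'ok'\n")

sys.path.append(base)         # add the PARENT directory of the package
from mypkg import module      # now the import resolves
print(module.f())             # -> ok
# importlib also works when the module name is only known at runtime:
print(importlib.import_module("mypkg.module").f())
```

Note that the entry added to sys.path is the directory containing the package, not the package directory itself.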