How to run two versions of Python (2, 3) in the same script - the current state of solving backwards compatibility issues.

Solving backwards compatibility issues

This brief article summarizes the current ways to solve backwards (or forwards) compatibility issues in Python, ranging from simple methods to harder and unconventional ones. You'll find the following approaches:

  • Re-writing the module
  • Using a translation package
  • Running multiple versions locally (python2/python3) and using argv
  • Embedding a dynamic "context placer" to the module to do regular calls
  • Compiling the program using PyInstaller with a dynamic context placer
  • Using Your Data Place

Re-writing the module with compatibility issues

This one will sound annoying, but if you're a developer with a little old Python 2 code you want to use in a Python 3 project, the easiest option (if possible) is re-writing that code. You cannot simply invoke a Python 2 module from Python 3 without some work - so if you can avoid that work or do less of it, that helps! One simple way of (sort of) re-writing the module is to use the __future__ package to make some pre-written functions behave as they do in Python 3. Try running your Python 2 package under Python 3, and for the functions causing problems, add the relevant imports, for example:

from __future__ import print_function
from __future__ import unicode_literals
from __future__ import division

See the __future__ documentation for more fixes.

Using a translation package (Python 2 specifically)

If you're trying to run a Python 2 package/module in Python 3, one way of trying to import it into the raw Python 3 runtime is to use a translation package. 

The advantage is that this can be a quick fix for a larger incompatible codebase, though automatic translation doesn't cover every case. The python-future translation documentation (linked in the notes below) demonstrates how to do this.
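As a sketch of this approach (assuming the third-party future package is installed, which provides the past.autotranslate hook described in its translation docs), you can register the translation hook before importing the legacy module:

```python
import importlib

def import_py2_module(name):
    """Import a Python 2-only module under Python 3 by first registering
    python-future's translation import hook (requires the third-party
    'future' package: pip install future)."""
    try:
        from past import autotranslate
    except ImportError:
        raise RuntimeError("the 'future' package is not installed")
    autotranslate([name])  # translate this package's source on import
    return importlib.import_module(name)
```

With the package installed, `import_py2_module("some_legacy_pkg")` would return the translated module; the name here is a placeholder for your own legacy package.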

Having multiple versions installed locally, and using command-line arguments to call the module

To invoke (for argument's sake) Python 2 from Python 3, you can convert your Python 2 library into one that can be invoked from the command line, using arguments (argv) to select which function to execute, and returning data through the program's standard output. For a code example, I've found this Stack Overflow answer, and for another method (using callable protocols), see this Stack Overflow answer.

The advantage of this method is that you can quickly repurpose a large old Python 2 library by writing a "helper" script which imports it, plus a basic string interpreter to call certain functions with certain data when invoked from the command line.

The drawbacks are that you need two local installations under different names (e.g. python2 and python3, or py2 and py3), and you are constrained to string-interpretable data types. If you want to pass a class as an argument to your Python 2 script, you will have to break that class down into a representation (like the JSON of the instance's __dict__) and use that -- but you won't get raw dependency injection (from what I understand at least; commenters, feel free to add).

import subprocess
subprocess.run(["python2", "email_helper.py"])  # calls the email automation helper
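To sketch this pattern end to end (the helper source and function names here are placeholders, not a real API), the Python 3 side can serialize arguments to JSON, run the other interpreter as a subprocess, and parse JSON back from its standard output:

```python
import json
import subprocess
import sys

def call_via_cli(source, function, args, python=sys.executable):
    """Run `function(*args)` in a separate interpreter process, passing the
    arguments in as a JSON command-line argument and reading the JSON result
    back from the child's standard output."""
    script = (
        source + "\n"
        "import json, sys\n"
        "print(json.dumps(" + function + "(*json.loads(sys.argv[1]))))\n"
    )
    proc = subprocess.run(
        [python, "-c", script, json.dumps(args)],
        capture_output=True, text=True, check=True,
    )
    return json.loads(proc.stdout)

# Here we call back into the current interpreter so the sketch is
# self-contained; point `python` at "python2" to target a legacy install.
result = call_via_cli("def add(a, b):\n    return a + b", "add", [2, 3])
```

This call returns 5, round-tripped through the child process's stdout; the same mechanism works for any JSON-representable arguments and return values.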

Embedding a context placer into the argument parser

This is a natural extension of the above-discussed ability to write a string parser for the arguments of a program. This way of adding backwards compatibility logic gives module interactions with old incompatible code a more "Pythonic" feel. The idea takes the string-parser concept and has you create a generic invocation layer (which builds an argument for the Python 2 program), and a generic interpreter which can take that argument from the invoker. What this means is you could do something like this:

import no_worries

no_worries.ydp.email(
    "me@jackhales.com",
    "Welcome to YDP!",
    "Thanks for following and reading my blog :)"
)

How would that be possible? Well, I haven't proved it yet. But! That's something I'm working on for YDP. My thought is you'd implement a __getattr__ in a class (say no_worries) which recursively builds out the call path, like ["no_worries", "ydp", "email"]. Then, once you make the call, a __call__ can pick up this intent and run through the built-up path as well as the arguments provided to the call.

The only part I haven't worked out in practice (only in theory) is interpreting this loose data on the unsupported module's end. I'll update this post after I've developed more of this feature.
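The __getattr__/__call__ mechanics above can be sketched in a few lines. This is a minimal proof of the path-building idea only: real dispatch to a Python 2 helper is replaced here by simply returning the collected path and arguments.

```python
class Proxy:
    """Recursively builds a dotted attribute path, then captures the call."""
    def __init__(self, path=()):
        self._path = path

    def __getattr__(self, name):
        # Each attribute access extends the path, so no_worries.ydp.email
        # accumulates ("no_worries", "ydp", "email").
        return Proxy(self._path + (name,))

    def __call__(self, *args):
        # A real implementation would serialize the path and arguments and
        # hand them to the legacy interpreter over the command line.
        return ".".join(self._path), args

no_worries = Proxy(("no_worries",))
path, args = no_worries.ydp.email("me@jackhales.com", "Welcome to YDP!")
```

Here `path` comes out as "no_worries.ydp.email" and `args` holds the call's arguments, which is exactly the intent a command-line interpreter on the other side would need.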

Compiling this dynamic context placer into an application

This part is not much different from the above point, but instead of using a raw Python project + module, you could compile your "helper" into an application of its own using PyInstaller. By doing this, you could theoretically embed an entire application into an executable, then use the above-mentioned dynamic context placer inside this helper executable and interact with as many modules as you want. This lowers the overhead of importing an entire incompatible old project, if that's your desire.

Using Your Data Place to do this all for you

All the above options range from great quick fixes to more hands-on approaches to working with backwards compatibility, but the easiest option, in my wonderfully biased view, would be to use Your Data Place. Using this, you're able to upload the incompatible codebase in question, set up the environment, and invoke it from the cloud or using our Python SDK, which utilizes some of the above-mentioned toolchains and more.

Thanks 

Thanks for reading my post on approaches to resolving backwards compatibility problems in Python. If you liked my writing, follow this blog for updates on Your Data Place and everything else I'm working on. My aim over time has landed on making complex data and functions simple, and I hope YDP is a big stepping stone toward that aim.

Below this paragraph (after the three dashes) are some notes I put together over a few days to track my development of some of these ideas. They may help you understand why this is an interesting problem I wanted to add to Your Data Place.

---

Work out how to import modules under one version, and have them invokable from another Python version. This doesn't seem possible, but it is worth looking into, and potentially writing an article describing why it isn't possible if that's the case.

https://stackoverflow.com/questions/44280909/enable-one-script-to-be-run-by-multiple-python-versions 

https://python-future.org/translation.html

https://stackoverflow.com/questions/27863832/calling-python-2-script-from-python-3

from __future__ import print_function
from __future__ import unicode_literals
from __future__ import division

May have to ask people to have them work in different areas.

Issues to solve when importing a whole new project into an area (pip has solved this issue):

  • Differing dependencies.

Is it possible to compile them into binaries, and invoke them using a CLI? That would be WILD. For instance, writing a smart CLI which allows for a lot of dynamic input - we could compile the "common SDK" into an EXE which can be invoked from the command line and would include an interpreter underneath. That is a WILD idea... And working something basic into this would really help people get an idea for how they could do this.

For instance, we can limit the input options people have to what can be represented in a JSON format. If someone wants the ability to pass a class with its instance data, they will need to transform it into a string and have that class exist on the other side.

This is a crazy idea, but it not only solves a tiny issue (compatibility), it also makes it much easier to import massive modules into the project and make them easily callable.

This is a WILD idea, which I want to develop further. Questions:

  • Can you import generic ML libraries as EXEs for quick reusability?
  • How can this get messy?
  • How will filesystems be utilized?
  • Run PyInstaller on the root file using the venv/Python, and it's locked away... Simple as that.
  • PyInstaller works on all platforms.
  • Read through this on more ideas https://realpython.com/pyinstaller-python/
  • The common library will then get the EXE and work from there
  • This can work on dataset functions as well as project functions
  • http://jsonpickle.github.io/ and https://stackoverflow.com/questions/3768895/how-to-make-a-class-json-serializable for passing in classes as an idea!
  • For complex data types, the user will have to convert them into other types, but one great option is using jsonpickle, which converts objects into JSON data structs for one-way use
  • Grab self.__dict__, as Python objects aren't that special
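The __dict__ note above can be demonstrated in a few lines. A minimal sketch, with a made-up class for illustration: the instance data is just a dict, which serializes to JSON and can cross the interpreter boundary, as long as the same class exists on the other side.

```python
import json

class EmailJob:
    """A stand-in class whose instance data we want to pass between processes."""
    def __init__(self, to, subject):
        self.to = to
        self.subject = subject

# Python objects aren't that special: grab __dict__, serialize it to JSON,
# and reconstruct an equivalent instance on the receiving side.
payload = json.dumps(EmailJob("me@jackhales.com", "Welcome to YDP!").__dict__)
restored = EmailJob(**json.loads(payload))
```

The `restored` object carries the same field values as the original, which is all a string-constrained command-line boundary can preserve.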

