Interfaces in Python, multiple inheritance vs. a home-made solution

I am writing a Python framework. In order to ensure a class has some properties, I make base "interface" classes like:

class BananaContainer:
    def __init__(self):
        self._bananas = []
    def bananas(self):
        return self._bananas

Then, if an object is supposed to be a container of bananas, I just derive it from BananaContainer.

A potential problem arises when an object is supposed to be several kinds of container at once. In my team we are asking ourselves whether multiple inheritance is the appropriate way to go or whether some alternative solution exists.

  • is it correct to have an object inherit from both BananaContainer and AppleContainer, for example?
    • can it hit us back later with problems like method resolution order or name collisions? (we can be careful not to have similar properties like .name, for example - we would like to limit those interfaces to the minimum required, nothing more)

A colleague is proposing a "capabilities" system based on composition instead of multiple inheritance. See below for an example of what is proposed:

class BaseCapabilities:
    EXISTING_CAPABILITIES = {"banana": BananaContainer, "apple": AppleContainer, ...}

    def get_capability(self, name):
        if name in self.capabilities():
            return self.EXISTING_CAPABILITIES[name](self)

    def capabilities(self):
        # should return a list of capability names
        raise NotImplementedError

Each object would have to derive from 1 class only, but then it needs to implement the capabilities machinery:

 class MyContainer(BaseCapabilities):
     def capabilities(self):
         return ["banana", "apple"]

Then, within the framework it is possible to check whether an object has a desired capability and to get an instance of the container class... This is to avoid multiple inheritance, which is considered harmful.
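To make the intent concrete, here is a trimmed, runnable sketch of how the framework would use the proposed system (only the banana capability is wired up; all names follow the example above):

```python
class BananaContainer:
    def __init__(self, owner):
        self.owner = owner  # the object this capability wraps

class BaseCapabilities:
    EXISTING_CAPABILITIES = {"banana": BananaContainer}

    def get_capability(self, name):
        if name in self.capabilities():
            return self.EXISTING_CAPABILITIES[name](self)

    def capabilities(self):
        # should return a list of capability names
        raise NotImplementedError

class MyContainer(BaseCapabilities):
    def capabilities(self):
        return ["banana"]

obj = MyContainer()
cap = obj.get_capability("banana")    # a BananaContainer wrapping obj
missing = obj.get_capability("apple") # None: capability not declared
```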

My preferred choice would be to go for multiple inheritance and to use isinstance(obj, BananaContainer), for example, to know whether obj is a banana container. But I understand my colleague's arguments too.

I would be very grateful to get some help to make a decision on this question.


Using Python's multiple inheritance (MI) for interfaces, abstract base classes, mixins, or similar techniques is perfectly fine. In most cases, the MRO produces intuitive results.

However, object initialization under multiple inheritance is really tricky. In Python you cannot combine multiple classes via MI unless all participating classes have been designed for it. The issue is that an __init__() method cannot know from which class it will be called or what the signature of the super().__init__() method will be. Effectively, this means that MI constructors:

  • must call the super().__init__()
  • must only take arguments by name, not by position
  • must forward any **kwargs to the super().__init__()
  • must not warn on unexpected arguments

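Such cooperative constructors might look like this (a minimal sketch reusing the class names from the question, with an invented FruitBasket subclass):

```python
class BananaContainer:
  def __init__(self, **kwargs):
    super().__init__(**kwargs)  # cooperate: forward remaining kwargs
    self._bananas = []

class AppleContainer:
  def __init__(self, **kwargs):
    super().__init__(**kwargs)
    self._apples = []

class FruitBasket(BananaContainer, AppleContainer):
  def __init__(self, *, owner, **kwargs):
    # super() walks the MRO: BananaContainer, then AppleContainer
    super().__init__(**kwargs)
    self.owner = owner

basket = FruitBasket(owner="me")  # every __init__() runs exactly once
```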
Where possible, the better alternative is to avoid __init__() methods for interface-like classes, and instead express requirements through abstract methods. For example, instead of a BananaContainer class, we might write this interface/ABC:

import abc  # abstract base classes

class BananaContainer(abc.ABC):
  @abc.abstractmethod
  def bananas(self) -> list:
    raise NotImplementedError

If a class wants to be a BananaContainer, it has to implement that method.
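A hypothetical implementing class then just fills in the abstract method, and isinstance() works as usual (the ABC is repeated here so the snippet is self-contained):

```python
import abc

class BananaContainer(abc.ABC):
  @abc.abstractmethod
  def bananas(self) -> list:
    raise NotImplementedError

class Crate(BananaContainer):  # hypothetical implementor
  def __init__(self):
    self._bananas = []

  def bananas(self) -> list:
    return self._bananas

crate = Crate()
assert isinstance(crate, BananaContainer)
# BananaContainer() itself raises TypeError, because the
# abstract method bananas() is not implemented
```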

In general, it is perfectly alright if you have a class that inherits from multiple interfaces or mixins. Aside from name collisions, the above __init__() problems, and general API bloat of the class, no noteworthy issues arise.

The second part of your question proposes a capability-based approach instead of using inheritance. Using composition instead of inheritance is often a very good idea. For example, you eliminate the initialization problems by design. It also tends to lead to more explicit APIs that are easier to navigate and avoid name clashes. There should be some method that either returns an object representing a capability, or None if the capability isn't supported.

But these capabilities can be implemented in different ways: either by using normal composition, or by storing the capabilities in your own data structures.

  • Unless you have special needs for the object model, stick to the language. Store capability objects in normal fields and provide normal methods to access them. This leads to a more comfortable API, and is more likely to work with auto-completers and type checkers.

  • If you need to modify the available capabilities of an object at run-time, and need to introduce new kinds of capabilities at run-time, then using a dictionary may be appropriate. But at this point you are inventing your own object system. This may be a good idea e.g. in games that have complex capability systems where new capabilities shall be defined in configuration files.

    Most software does not have these requirements, and does not benefit from that kind of flexibility.

    Additionally, Python's built-in object system is flexible enough that you could create new types and new methods without having to create a new object system. Builtins like getattr(), setattr(), hasattr(), and the type() constructor come in handy here.
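For illustration, a throwaway sketch of those builtins (all names here are made up):

```python
# type(name, bases, namespace) creates a new class at run-time
Robot = type("Robot", (), {"greet": lambda self: "beep"})

r = Robot()
setattr(r, "energy", 100)  # attach a new field dynamically
assert hasattr(r, "energy")
assert getattr(r, "greet")() == "beep"
```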

I would likely express an object that can have both AppleContainer and BananaContainer capabilities like this:

from typing import Optional

class BananaContainer:
  def __init__(self, y): ...  # details elided

class AppleContainer:
  def __init__(self, y): ...  # details elided

class HasCapabilities:
  def __init__(self, x, y, z):
    # somehow determine the appropriate capabilities and initialize them
    self._banana_container = BananaContainer(y) if x else None
    self._apple_container = AppleContainer(y)

  def as_banana_container(self) -> Optional[BananaContainer]:
    return self._banana_container

  def as_apple_container(self) -> Optional[AppleContainer]:
    return self._apple_container

o = HasCapabilities(...)
bc = o.as_banana_container()
if bc is not None:
  ...  # use the banana container

Or with Python 3.8+ assignment expressions:

if (bc := o.as_banana_container()) is not None:
  ...  # use the banana container

If you want to have some custom mechanisms for reflection over capabilities, you can implement that on top of this solution, with some amount of boilerplate. If we want to be MI-safe, we might declare the following base class that all capability-having classes need to inherit:

class CapabilityReflection:
  # base implementations so that actual implementations
  # can safely call super()._list_capabilities()
  def _list_capabilities(self):
    return ()

  def all_capabilities(self):
    """Deduplicated set of capabilities that this object supports."""
    return set(self._list_capabilities())

  def get_capability(self, captype):
    """Find a capability by its type. Returns None if not supported."""
    return None

which in the above case would have been implemented as:

class HasCapabilities(CapabilityReflection):
  def _list_capabilities(self):
    # go through the accessor methods in case they have been overridden
    caps = [self.as_banana_container(), self.as_apple_container()]
    yield from (cap for cap in caps if cap is not None)
    yield from super()._list_capabilities()

  def get_capability(self, captype):
    if captype is BananaContainer:
      return self.as_banana_container()
    if captype is AppleContainer:
      return self.as_apple_container()
    return super().get_capability(captype)
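Putting the pieces together, here is a self-contained version of that sketch (the with_bananas flag is an invented stand-in for the real initialization logic):

```python
class BananaContainer:
  def bananas(self):
    return []

class AppleContainer:
  def apples(self):
    return []

class CapabilityReflection:
  def _list_capabilities(self):
    return ()

  def all_capabilities(self):
    return set(self._list_capabilities())

  def get_capability(self, captype):
    return None

class HasCapabilities(CapabilityReflection):
  def __init__(self, with_bananas):
    self._banana_container = BananaContainer() if with_bananas else None
    self._apple_container = AppleContainer()

  def as_banana_container(self):
    return self._banana_container

  def as_apple_container(self):
    return self._apple_container

  def _list_capabilities(self):
    caps = [self.as_banana_container(), self.as_apple_container()]
    yield from (cap for cap in caps if cap is not None)
    yield from super()._list_capabilities()

  def get_capability(self, captype):
    if captype is BananaContainer:
      return self.as_banana_container()
    if captype is AppleContainer:
      return self.as_apple_container()
    return super().get_capability(captype)

o = HasCapabilities(with_bananas=False)
assert o.get_capability(BananaContainer) is None  # capability absent
assert o.all_capabilities() == {o.as_apple_container()}
```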