This is based on my current (emphasis on current!) thoughts about what's best for code.
Dive in!
- remove support for class and static methods. I believe it is neater for classes to be just prototypes or schemas for objects. So where would class and static methods go? They would go into the module :). For example,
# rectangle.py - home of all things rectangular!
from dataclasses import dataclass

# This counts all the rectangles that have been created so far. This would have been a class variable.
# You might think, ooh, global variable -> bad and yes, you would be correct but it is the same as having it defined
# in the class because classes are global variables too. And of course, both versions are not threadsafe.
RECTANGLE_COUNT = 0

@dataclass
class Rectangle:
    length: float
    breadth: float

    def __post_init__(self):
        global RECTANGLE_COUNT
        RECTANGLE_COUNT += 1

# This would have been a static function, but I think it is neater to have it outside the class.
# It belongs to the "world" of rectangles (which is what rectangle.py is) and doesn't need to be part of rectangle objects.
def scale(rectangle: Rectangle, multiplier: float) -> Rectangle:
    return Rectangle(length=rectangle.length * multiplier, breadth=rectangle.breadth * multiplier)
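A usage sketch (assuming the module above is importable as rectangle); the call site reads much like a classmethod or staticmethod call would have:

import rectangle

r = rectangle.Rectangle(length=2.0, breadth=3.0)
bigger = rectangle.scale(r, multiplier=2.0)
print(rectangle.RECTANGLE_COUNT)  # 2 - both r and bigger were counted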
- I would make everything private by default instead of the current public-by-default. I believe encapsulation is a very, very important principle that should be thought about when building complex systems. It means you can say explicitly, with guarantees, what is internal and what is exposed externally, and you can change internal stuff without ever having to worry that someone is using it. I know python has this whole leading-underscore convention to indicate private stuff, but I am not a conventions person; I would rather have guarantees (more on how weak today's convention is after the examples below).
# There are a couple of ways this could be done. One way is a public keyword:
public a: int = 124

public def func(a):
    return a

# another mechanism might be hacking type hints (maybe better to call them type annotations or
# something else, because with this public thing they would be doing more than just annotating the type)
a: public[int] = 122

def func(a: int) -> public[int]:  # this feels weird, marking the return value public when the function itself is not public?
    return a
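For contrast, the closest thing today's Python has to an enforced boundary is name mangling, and even that is a speed bump rather than a guarantee (a minimal sketch):

class Service:
    def __init__(self):
        self.__data = 'internal'  # stored under the mangled name _Service__data

s = Service()
# s.__data  # AttributeError: 'Service' object has no attribute '__data'
print(s._Service__data)  # but the mangled name is still reachable, so no real guarantee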
- pass self automatically to all the methods defined in a class. It feels very redundant to me to have to pass in self everywhere, especially if we go with what I said earlier, that there are no class or static methods on classes. You would still refer to the object with self.
class Service:
    data: str

    def get():
        return self.data

    def put(new_data: str):
        self.data = new_data
- a @test decorator - I hate that python tests are just functions with names prefixed with test. I would rather do something like the stuff below. p.s. thanks to Kevlin for demoing something like this in one of his talks.
# something like this
@test('ensure that the blah is blah'):
    assert func('abc') == 'xyz'

# the above is not valid python; the closest is:
@test('this does this')
def _():  # it would be even better if this line could be replaced with, say, a namespace or something
    assert func('abc') == 'xyz'

# OR maybe do it like this
@annotate('This does this')
def test():  # every test function would be called test? that's okay?
    assert func('abc') == 'xyz'
# in the case where tests are grouped into test suites, what should happen?
class FuncTestSuite:
    @annotate('This does this')
    def test():
        assert func('abc') == 'xyz'

    @annotate('This does this')
    def test():
        assert func('abc') == 'xyz'
# Actually, I might be able to implement this right now, using metaclasses etc. (see the sketch after this block)
# Another thing: I don't really like calling this a class. "Class" always screams object prototype to me.
# Would prefer a version with, say, a test keyword.. so
test FuncTestSuite:
    func: Callable = None

    def setUp():
        self.func = func

    @annotate('This does this')
    def test():
        assert self.func('abc') == 'xyz'
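To back up the "I might be able to implement this right now" comment above: here is a minimal sketch in today's Python. It doesn't even need metaclasses, just a parameterised decorator that registers each test with its description (TESTS and run_all are made-up names):

import traceback

TESTS = []  # (description, function) pairs collected at import time

def test(description):
    def register(fn):
        TESTS.append((description, fn))
        return fn
    return register

def run_all():
    for description, fn in TESTS:
        try:
            fn()
            print(f'PASS: {description}')
        except AssertionError:
            print(f'FAIL: {description}')
            traceback.print_exc()

@test('ensure that the blah is blah')
def _():
    assert 'abc'.upper() == 'ABC'

run_all()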
- I would love to be able to declare new keywords, e.g. the test keyword introduced in the previous point or the public one (or whatever stuff doesn't have standard semantics).
- I would rely less on magic methods like __lt__ and __add__ and instead have hooks into the interpreter for stuff like this, using dispatching:
# this is in the rational_number.py module - the world of rational numbers!
from dataclasses import dataclass
from math import gcd
from numbers import Number

@dataclass
class RationalNumber(Number):
    numerator: int
    denominator: int

    def __post_init__(self):
        g = gcd(self.numerator, self.denominator)
        self.numerator //= g
        self.denominator //= g

# this is more natural than an __add__ method on a class because addition is about both operands, not something one operand owns
@operator.add.register
def add(a: RationalNumber, b: RationalNumber) -> RationalNumber:  # the type hints need to be required here or it would be able to add anything
    return RationalNumber(
        numerator=a.numerator * b.denominator + b.numerator * a.denominator,
        denominator=a.denominator * b.denominator,
    )

# this means we can do
a: RationalNumber = RationalNumber(2, 3) + RationalNumber(4, 5)
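Today's Python can't hook the + operator like this, but functools.singledispatch gets partway there. A sketch (it dispatches on the first argument's type only and isn't wired to +; the real idea would need interpreter support or a third-party multiple-dispatch library):

from functools import singledispatch

@singledispatch
def add(a, b):
    raise TypeError(f'no add registered for {type(a).__name__}')

@add.register
def _(a: RationalNumber, b: RationalNumber) -> RationalNumber:
    return RationalNumber(
        numerator=a.numerator * b.denominator + b.numerator * a.denominator,
        denominator=a.denominator * b.denominator,
    )

assert add(RationalNumber(1, 2), RationalNumber(1, 3)) == RationalNumber(5, 6)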
- I would add support for frozen objects - I believe the attrs project already has this. The more we can ensure is immutable the better.
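The stdlib already gets close here too: dataclasses support frozen=True (attrs has the same idea). A quick sketch:

from dataclasses import dataclass

@dataclass(frozen=True)
class Point:
    x: float
    y: float

p = Point(1.0, 2.0)
# p.x = 3.0  # would raise dataclasses.FrozenInstanceError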
So thoughts? Agree, disagree? What would you add, remove, or modify in this list? Add comments below.
Really great list. A bunch of thoughts running through my head:
I don't know how I feel about this. My major issue with this would be readability. I picture a file that would look like this:
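(A sketch of the kind of file I mean; all the names are made up:)

# shapes.py - every shape, plus all its would-be class/static methods, in one module
RECTANGLE_COUNT = 0
CIRCLE_COUNT = 0

class Rectangle: ...
def scale(rectangle, multiplier): ...
def rectangle_area(rectangle): ...

class Circle: ...
def scale_circle(circle, multiplier): ...
def circle_area(circle): ...

# ...and three more classes, each trailed by its own pile of module-level functions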
Now, this can scale into, say, 5 classes with a whole bunch of class/static methods. I pretty much think this would be less readable than having each method (class or static) under the class declaration.
I'm pretty much saying "I want to be able to see everything about a class under the class. Not outside it."
Keywords, as much as I understand them, are built into the language, and the language is tied to the interpreter. Would creating a new keyword not mean we're essentially attempting to update the interpreter (the base language) from a project's point of view? I can imagine all sorts of things going wrong with that.
Would it be wrong to think decorators (or even custom context managers) should take care of the new-keyword tweak? If I can create a project-wide decorator (or context manager), do I really still need to be able to create custom keywords?
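For instance, a context manager can already fake a lot of what the proposed test keyword does; a rough sketch:

from contextlib import contextmanager

@contextmanager
def test(description):
    try:
        yield
        print(f'PASS: {description}')
    except AssertionError:
        print(f'FAIL: {description}')

with test('ensure that the blah is blah'):
    assert 'abc'.upper() == 'ABC'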