Selection @pythonetc, September 2018


    This is the fourth selection of tips about Python and programming from my @pythonetc author channel.





    Override and Overload


    There are two concepts that are easily confused: overriding and overloading.


    Overriding happens when a child class defines a method already provided by a parent class and thereby replaces it. In some languages you are required to mark the overriding method explicitly (the override modifier in C#), while in others the marking is optional (the @Override annotation in Java). Python neither requires a special modifier nor provides a standard way to mark such methods (some people use a custom @override decorator that does nothing, purely for readability).
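
    A minimal sketch of overriding in Python (the class names are made up for illustration):

```python
class Animal:
    def speak(self):
        return '...'

class Dog(Animal):
    def speak(self):  # replaces Animal.speak for Dog instances
        return 'Woof'

print(Dog().speak())     # Woof
print(Animal().speak())  # ...
```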


    Overloading is another story. This term refers to the situation when there are several functions with the same name but different signatures. Overloading is possible in Java and C++, where it is often used to provide default arguments:


    class Foo {
        public static void main(String[] args) {
            System.out.println(hello());
        }

        public static String hello() {
            return hello("world");
        }

        public static String hello(String name) {
            return "Hello, " + name;
        }
    }

    Python does not support dispatching functions by signature, only by name. You can, of course, write code that explicitly analyzes the types and number of arguments, but it looks awkward, and the practice is best avoided:


    def quadrilateral_area(*args):
        if len(args) == 4:
            quadrilateral = Quadrilateral(*args)
        elif len(args) == 1:
            quadrilateral = args[0]
        else:
            raise TypeError()
        return quadrilateral.area()

    If you only need the overloaded signatures for type hints, use the @overload decorator from the typing module:


    from typing import overload

    @overload
    def quadrilateral_area(
        q: Quadrilateral
    ) -> float: ...

    @overload
    def quadrilateral_area(
        p1: Point, p2: Point,
        p3: Point, p4: Point
    ) -> float: ...

    Autovivification


    collections.defaultdict allows you to create a dictionary that returns a default value when the requested key is missing (instead of raising KeyError). To create a defaultdict, you need to provide not a default value but a factory of such values.
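
    For example, the int type itself can serve as such a factory, since int() returns 0 (a minimal sketch with made-up data):

```python
from collections import defaultdict

counts = defaultdict(int)  # int() returns 0 for missing keys
for word in ['spam', 'eggs', 'spam']:
    counts[word] += 1

print(dict(counts))  # {'spam': 2, 'eggs': 1}
```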


    So you can create a dictionary with a virtually infinite number of nested dictionaries, which allows you to use constructs like d[a][b][c]...[z].


    >>> from collections import defaultdict
    >>> def infinite_dict():
    ...     return defaultdict(infinite_dict)
    ...
    >>> d = infinite_dict()
    >>> d[1][2][3][4] = 10
    >>> dict(d[1][2][3][5])
    {}

    This behavior is called “autovivification,” a term that comes from Perl.


    Instantiation


    Instantiating an object involves two important steps. First, the class's __new__ method is called; it creates and returns a new object. Then Python calls the __init__ method, which sets up the initial state of that object.


    However, __init__ will not be called if __new__ returns an object that is not an instance of the original class. In that case, the object may have been created by another class, which means __init__ has already been called on it:


    class Foo:
        def __new__(cls, x):
            return dict(x=x)

        def __init__(self, x):
            print(x)  # Never called

    print(Foo(0))

    This also means that you should not instantiate the same class inside __new__ using the regular constructor call (Foo(...)). It can lead to __init__ being executed twice, or even to infinite recursion.


    Infinite recursion:


    class Foo:
        def __new__(cls, x):
            return Foo(-x)  # Recursion

    Double execution of __init__:


    class Foo:
        def __new__(cls, x):
            if x < 0:
                return Foo(-x)
            return super().__new__(cls)

        def __init__(self, x):
            print(x)
            self._x = x

    The right way:


    class Foo:
        def __new__(cls, x):
            if x < 0:
                return cls.__new__(cls, -x)
            return super().__new__(cls)

        def __init__(self, x):
            print(x)
            self._x = x

    The [] operator and slices


    In Python, you can override the [] operator by defining the magic method __getitem__. For example, you can create an object that virtually contains an infinite number of repeating elements:


    class Cycle:
        def __init__(self, lst):
            self._lst = lst

        def __getitem__(self, index):
            return self._lst[index % len(self._lst)]

    print(Cycle(['a', 'b', 'c'])[100])  # 'b'

    The unusual thing here is that the [] operator supports its own special syntax. With it you can write not only [2], but also [2:10], [2:10:2], [2::2] and even [:]. The intended semantics is [start:stop:step], but in a custom object you can interpret it any way you like.


    But if __getitem__ is invoked with this syntax, what does it receive as the index parameter? That is what slice objects are for.


    In : class Inspector:
    ...:     def __getitem__(self, index):
    ...:         print(index)
    ...:
    In : Inspector()[1]
    1
    In : Inspector()[1:2]
    slice(1, 2, None)
    In : Inspector()[1:2:3]
    slice(1, 2, 3)
    In : Inspector()[:]
    slice(None, None, None)

    You can even combine syntaxes of tuples and slices:


    In : Inspector()[:, 0, :]
    (slice(None, None, None), 0, slice(None, None, None))
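
    Such tuples appear, for instance, in NumPy-style multidimensional indexing. A minimal sketch of a class that unpacks them (the Grid class and its data are made up for illustration):

```python
class Grid:
    def __init__(self, rows):
        self._rows = rows

    def __getitem__(self, index):
        row, col = index  # the index arrives as a tuple
        if isinstance(row, slice):
            return [r[col] for r in self._rows[row]]
        return self._rows[row][col]

g = Grid([[1, 2, 3],
          [4, 5, 6],
          [7, 8, 9]])
print(g[1, 2])    # 6
print(g[0:2, 1])  # [2, 5]
```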

    slice does nothing by itself; it only stores the start, stop and step attributes.


    In : s = slice(1, 2, 3)
    In : s.start
    Out: 1
    In : s.stop
    Out: 2
    In : s.step
    Out: 3
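
    Besides these three attributes, slice objects provide the indices() helper, which resolves None and negative values against a concrete sequence length — handy when implementing __getitem__ by hand:

```python
s = slice(None, None, 2)
# indices(length) returns a concrete (start, stop, step) triple
print(s.indices(5))                # (0, 5, 2)
print(slice(-3, None).indices(5))  # (2, 5, 1)
```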

    Asyncio interruption


    Any coroutine running in asyncio can be interrupted with the cancel() method of its task. A CancelledError is then thrown into the coroutine, and it and all the coroutines awaiting it are terminated one after another until the error is caught and suppressed.


    CancelledError is a subclass of Exception, which means it can be accidentally caught by a try ... except Exception block intended to catch "any error". To handle exceptions safely inside a coroutine, you have to re-raise it explicitly:


    try:
        await action()
    except asyncio.CancelledError:
        raise
    except Exception:
        logging.exception('action failed')
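
    A runnable sketch of cancellation (the action coroutine and the events list are made up for illustration):

```python
import asyncio

events = []

async def action():
    try:
        await asyncio.sleep(10)
    except asyncio.CancelledError:
        events.append('cancelled inside coroutine')
        raise  # re-raise, otherwise the cancellation is suppressed

async def main():
    task = asyncio.ensure_future(action())
    await asyncio.sleep(0)  # let action() start and reach its await
    task.cancel()           # schedules CancelledError inside action()
    try:
        await task
    except asyncio.CancelledError:
        events.append('caught in caller')

asyncio.run(main())
print(events)
```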

    Execution planning


    To schedule the execution of some code at a certain time in asyncio, you usually create a task that performs await asyncio.sleep(x):


    import asyncio

    async def do(n=0):
        print(n)
        await asyncio.sleep(1)
        loop.create_task(do(n + 1))

    loop = asyncio.get_event_loop()
    loop.create_task(do())
    loop.run_forever()

    But creating a new task can be expensive, and you don't have to do it if you don't plan to perform asynchronous operations (like the do function in my example). Instead, you can use the loop.call_later and loop.call_at functions, which let you schedule an ordinary (synchronous) callback:


    import asyncio

    def do(n=0):
        print(n)
        loop = asyncio.get_event_loop()
        loop.call_later(1, do, n + 1)

    loop = asyncio.get_event_loop()
    do()
    loop.run_forever()
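
    While call_later takes a relative delay, loop.call_at takes an absolute time on the loop's internal monotonic clock. A minimal sketch (the tick callback and fired list are made up for illustration):

```python
import asyncio

fired = []

def tick():
    fired.append('tick')
    loop.stop()  # stop the loop after the single callback

loop = asyncio.new_event_loop()
# call_at expects an absolute timestamp from the loop's clock
loop.call_at(loop.time() + 0.01, tick)
loop.run_forever()
loop.close()
```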
