welcome back everyone, I'm James Murphy. Python 3.12 has just been released, and here's my take on all the latest and greatest. my company is mCoding, we do software consulting, check us out at mcoding.io. fans of static typing will be pleased to
know about the biggest change in Python 3.12: the new syntax for type parameters and the type statement. when defining a generic function in Python you can now declare type variables for the function in brackets. previously you had to import TypeVar and create your own type variable, making sure the name that you assign it to matches the name in quotes. that's no longer necessary, just put brackets T. you can also put multiple type variables, TypeVarTuples, and ParamSpecs, and you can constrain or bound the allowed types.
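here's a rough sketch of the new syntax; the function names are just illustrative:

    from collections.abc import Callable

    # old way: T = TypeVar("T"), where the quoted name has to match the variable name
    # new in 3.12: declare type parameters in brackets right on the function
    def first[T](items: list[T]) -> T:
        return items[0]

    # multiple type variables
    def pair[K, V](key: K, value: V) -> tuple[K, V]:
        return key, value

    # a TypeVarTuple (*Ts) and a ParamSpec (**P)
    def as_tuple[*Ts](*args: *Ts) -> tuple[*Ts]:
        return args

    def logged[R, **P](func: Callable[P, R]) -> Callable[P, R]:
        def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
            print("calling", func.__name__)
            return func(*args, **kwargs)
        return wrapper

    # constrained and bounded type variables
    def clamp[T: (int, float)](value: T, low: T, high: T) -> T:  # T must be int or float
        return max(low, min(high, value))

    def longest[S: str](items: list[S]) -> S:  # S must be a subtype of str
        return max(items, key=len)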
this syntax is even more convenient for defining generic classes. previously, defining generic classes was very confusing: you had to import stuff and define your own type variables, and then you had to learn about category theory and pick the right kind of variance for your type variable. that used to be super annoying and confusing, but thanks to python 3.12 it's no longer necessary. you don't even have to know about covariance anymore, python just does the right thing and automatically determines it.
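a minimal before-and-after sketch, with an illustrative Box class:

    # before 3.12: explicit TypeVar, and you had to pick the variance yourself
    # from typing import Generic, TypeVar
    # T_co = TypeVar("T_co", covariant=True)
    # class Box(Generic[T_co]): ...

    # 3.12: just declare the type parameter; the type checker infers variance
    class Box[T]:
        def __init__(self, item: T) -> None:
            self.item = item

        def get(self) -> T:
            return self.item

    b = Box(42)  # inferred as Box[int] by the type checker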
you can now also use the type keyword to define your own type aliases. you could sort of already do this by declaring a TypeAlias annotation, but that usage is now deprecated. notice that in the sketch below the type alias is defined before Expected is defined; that's fine, these type aliases are evaluated lazily, and they can even be generic too, go wild.
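roughly like this, with Expected standing in for the class from the on-screen example:

    from typing import TypeAlias

    IntList: TypeAlias = list[int]  # the old spelling, deprecated in 3.12

    # new in 3.12: the type statement; Expected is referenced here before it's
    # defined, which is fine because the alias value is evaluated lazily
    type Response = list[Expected]

    class Expected:
        ...

    # type aliases can be generic too
    type Pairs[K, V] = dict[K, list[V]]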
this new syntax is going to make generics in Python a lot less cumbersome to write. so if you love static typing I rate this 8 out of 10; if you hate static typing I'll still rate it 3 out of 10. next up, PEP 701: f-strings are now
formally part of the Python grammar for historical reasons they used to
be implemented in a kind of hacky way one of the consequences of that hack is that
you couldn't use the same kind of quotes that started the f-string inside one of the f-string expressions-- without escaping, python would see that quote and think you're trying to end the f-string in an unbalanced way. well, no more: you can now put whatever expression you want inside an f-string, including another f-string.
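for example (the dictionary here is made up):

    songs = {"artist one": "track a", "artist two": "track b"}

    # 3.11 and earlier: SyntaxError, the inner double quotes would "end" the f-string
    # 3.12: reusing the same kind of quote inside the expression is fine
    print(f"first song: {songs["artist one"]}")

    # an expression can even contain another f-string (legal, not necessarily wise)
    name = "world"
    print(f"greeting: {f"hello {name}"}")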
but of course, just because you can doesn't mean you should-- keep things readable all around. this is a great update that gets rid of an ugly wart that python had, 9 out of 10. next up, PEP 684: a unique per-interpreter GIL. did you know you can have
multiple python interpreters in the same process-- even in the same thread? here an interpreter means the part of python that runs in a loop reading and executing python bytecode. the main purpose of a sub interpreter is to provide isolation: each sub interpreter is like a little sandbox that can execute code independently from the other interpreters; it has its own globals, imports, etc.
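there's no official python-level API for sub interpreters in 3.12 (creating one with its own GIL is done through the C API), but as a rough sketch of the idea you can poke at the internal, undocumented _xxsubinterpreters module; treat this as illustrative only, since it's subject to change:

    import _xxsubinterpreters as interpreters  # internal module, not a public API

    interp_id = interpreters.create()

    # code run in the sub interpreter gets its own globals and its own imports
    interpreters.run_string(interp_id, "import json; data = json.dumps({'x': 1})")

    # nothing leaks back into the main interpreter's namespace
    print("data" in globals())  # False

    interpreters.destroy(interp_id)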
an example usage is something like mod_wsgi: for security reasons, a web server serving multiple different applications should keep data for each application separate, therefore mod_wsgi creates a sub interpreter for each application group. when a request comes in, a worker will acquire and use the sub interpreter dedicated to that application group. so where does PEP 684 come in? well
python has this thing called the GIL, the global interpreter lock, which protects the interpreter's shared internal state. but that also means python is limited
in multi-threaded performance because only the thread that currently holds
the GIL can actually run python code while sub interpreters do provide isolation there's still a small amount of
state that's shared between them and in particular the GIL is--
or WAS shared between them. this PEP makes it so each interpreter gets its own GIL, which will allow true
multi-threaded performance in Python for the very small number of projects
that actually use sub interpreters in summary a per interpreter GIL is a great idea it just only applies to a
few projects so 6 out of 10 have you ever tried debugging your python
application and found it to be unbearably slow compared to when you just run it normally? introducing PEP 669, low impact monitoring. here's a direct quote from the pep: "by using quickening, we expect that code run under a debugger on 3.12 should outperform code run without a debugger on 3.11." previously, python debuggers worked using sys.settrace, which allows essentially the same functionality in a less efficient way. the new sys.monitoring namespace provides a clean API to register for events.
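a small sketch of what registering for events looks like; the tool name and callback are my own illustrative choices:

    import sys
    from types import CodeType

    mon = sys.monitoring
    TOOL_ID = mon.DEBUGGER_ID  # tool id slots are reserved for debuggers, profilers, etc.

    def on_line(code: CodeType, line_number: int) -> None:
        print(f"executing {code.co_name}:{line_number}")

    mon.use_tool_id(TOOL_ID, "toy-debugger")
    mon.register_callback(TOOL_ID, mon.events.LINE, on_line)
    mon.set_events(TOOL_ID, mon.events.LINE)

    def demo() -> int:
        x = 1
        return x + 1

    demo()

    mon.set_events(TOOL_ID, 0)  # stop receiving events
    mon.free_tool_id(TOOL_ID)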
the implementation also takes advantage of the ongoing work of specializing instructions at runtime. "quickening", as mentioned before, is the process of replacing slow instructions with faster specialized versions at runtime. this is all part of the faster
CPython project link below anyway better debugging I'm
definitely in 8 out of 10 next up better error messages
particularly for common typos. did you forget a self. somewhere, or use the wrong syntax for an import? now python will tell you exactly what you did wrong.
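for example (the exact wording of the messages may differ slightly from these comments):

    class Greeter:
        def __init__(self, name: str) -> None:
            self.name = name

        def greet(self) -> str:
            return "hello " + name  # oops, forgot the self.

    Greeter("world").greet()
    # NameError: name 'name' is not defined. Did you mean: 'self.name'?

    # and a common import typo now gets a hint too:
    #     import pathlib from os
    # SyntaxError: Did you mean to use 'from ... import ...' instead?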
these are small changes, but literally every developer benefits from better error messages, 10 out of 10. ahh yes, PEP 683: Immortal
objects-- this is a juicy one do you run a Django app with over
2 billion monthly active users if so this is the most exciting
python 3.12 update for you you see if you have two billion users
you're going to need to fork a lot of worker processes to handle all
those requests they generate good thing forking is so fast on
Linux because it uses copy on write but wouldn't it just be a darn shame if
every read in Python was also a write oh wait every read in Python is
also a write, because when you read a python object, you first increase its ref count, and that's a write. that write dirties the memory page that copy on write was sharing, so the page gets copied anyway-- too bad that's going to trash your cache and ruin your day. introducing Immortal objects: twiddle one of those high bits in an
object's ref count and poof it's now Immortal Immortal objects do not participate
in ref counting because they're meant to live forever-- at least
until interpreter shutdown. there's no point in keeping track of how many references there are to None, True, False, and other global immutable built-ins, so python 3.12 makes them Immortal too.
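you can sort of observe this from python; the exact sentinel value is an implementation detail:

    import sys

    before = sys.getrefcount(None)
    lots_of_references = [None] * 1_000_000
    after = sys.getrefcount(None)

    # 3.11: the count goes up by about a million
    # 3.12: None is immortal, its ref count is a fixed sentinel and doesn't change
    print(before, after, before == after)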
with Immortals, you can now have a read that's really just a read. 2 out of 10, I don't have 2
billion monthly active users next up PEP 709 comprehension inlining previously dictionary list
and set comprehensions like these were defined under the hood using functions meaning it literally compiles the inside
of a comprehension as a separate function which it then instantiates and calls immediately that creates a bit of overhead
because you now have to create this function object and create a
stack frame when you call the function. well, that's no more: they changed the implementation, and dictionary, list, and set comprehensions
no longer use functions under the hood all the comprehensions get compiled
inline inside the enclosing function. hopefully you won't really notice this change. you'll get a slightly different traceback if you have an exception inside a comprehension, because there's no longer a separate stack frame associated with the comprehension. and since comprehensions are now compiled inline, a call to locals() inside a comprehension now sees the local variables of the outer function, as opposed to only the local variables of that inner comprehension function's scope.
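a rough illustration (the exact contents of the locals() dict vary by version):

    def demo() -> list[dict]:
        label = "squares"
        return [locals() for _ in range(1)]

    # 3.11: the comprehension ran in its own hidden function, so locals() only saw
    #       the comprehension's own variables
    # 3.12: the comprehension is inlined into demo, so locals() also contains 'label'
    print(demo())

    # tracebacks change slightly too: this intentionally raises, and on 3.12 the
    # traceback no longer shows a separate "<listcomp>" frame
    [1 // 0 for _ in range(1)]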
I mean, it's a tiny improvement, but why not, 6 out of 10. next up, PEP 688: using the
buffer protocol from python the buffer protocol is a common interface
allowing many libraries like numpy to interoperate as long as they can
conceptualize their data storage as some kind of n-dimensional array determined by a few properties like the number of
dimensions shape strides and element size previously the buffer interface
was only accessible in the C API. this pep exposes that interface to python by adding two new magic dunder methods, __buffer__ and __release_buffer__.
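a minimal sketch of a class opting into the buffer protocol, assuming a simple array-backed wrapper:

    import array

    class Samples:
        def __init__(self, values: list[float]) -> None:
            self._data = array.array("d", values)

        def __buffer__(self, flags: int) -> memoryview:
            # hand out a view over our underlying storage
            return memoryview(self._data)

        def __release_buffer__(self, view: memoryview) -> None:
            # called when a consumer is done with the buffer
            view.release()

    s = Samples([1.0, 2.0, 3.0])
    m = memoryview(s)  # works because Samples implements __buffer__
    print(m.format, m.nbytes, m.tolist())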
the usefulness of buffers in Python is limited, but it could be useful for debugging or potentially building a bridge type that connects two C extensions. I probably won't use it, 4 out of 10. next up, PEP 692: using TypedDict to annotate kwargs. the default way that you annotate keyword
arguments in Python is not very intuitive kwargs is always going to be
a dictionary, and the keys of the dictionary are always going to be strings, so originally python decided that means you just need to annotate the value type. so if I annotate kwargs as an int, this means that it's a dictionary of string to int. that's already confusing enough,
but also when would I want just an arbitrary number of extra keyword
arguments that all have to be integers? if you have a use case I'd love to
hear your thoughts comment below if I was going to take a bunch of keyword arguments I'd probably want them
to have some specific structure a typed dictionary seems like a
perfect match for defining the structure of a dictionary, so why don't we use that? we can't just type hint kwargs as a Movie, because remember, that's going to be interpreted as a dictionary of strings to Movies. well, in 3.12 they added Unpack
into the typing library which instructs the type checker to do
what you would hope it would do, namely treat kwargs like it's supposed to be one of these typed dictionaries.
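a minimal sketch using a Movie TypedDict like the one in the pep:

    from typing import TypedDict, Unpack

    class Movie(TypedDict):
        name: str
        year: int

    # **kwargs: int           -> dict[str, int], every extra kwarg must be an int
    # **kwargs: Unpack[Movie] -> kwargs must be exactly name=str, year=int
    def add_movie(**kwargs: Unpack[Movie]) -> None:
        print(kwargs["name"], kwargs["year"])

    add_movie(name="Blade Runner", year=1982)    # ok
    add_movie(name="Blade Runner", year="1982")  # flagged by the type checker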
I'll give it 4 out of 10. I guess I would really prefer to write out explicit keyword parameters instead of using kwargs, but kwargs are part of the language, so we should at least be able to type them properly. next up: developers who are paid by the
line of code are going to love this one PEP 698 the typing override decorator let's say you have a class hierarchy we have a base class shape
that has a calc_area method then we have squares and
triangles that are both shapes both Square and Triangle have their own
specialized way of calculating their area the programmer then uses the override
decorator to signal that yes this implementation of calc_area is meant to be
an override of something in a base class this doesn't really do much at runtime but
when we run our type checker we find a problem the method used to be called get_area but we
did a big refactor and it's now called calc_area but it looks like we missed a spot and
there's still a get_area floating around the base class may have even
had a correct albeit slower implementation of calc_area for the triangle
so potentially all of our tests still pass. but because this method is marked override, and there's no get_area function left in the parent class, this now becomes a type error that we can catch at static analysis time.
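a sketch of that scenario:

    from typing import override

    class Shape:
        def calc_area(self) -> float:  # renamed from get_area in the big refactor
            ...

    class Square(Shape):
        def __init__(self, side: float) -> None:
            self.side = side

        @override
        def calc_area(self) -> float:  # fine: Shape really does have calc_area
            return self.side ** 2

    class Triangle(Shape):
        def __init__(self, base: float, height: float) -> None:
            self.base = base
            self.height = height

        @override
        def get_area(self) -> float:  # the spot we missed in the rename:
            return self.base * self.height / 2  # type checker: nothing to override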
there's also a suggested strict mode that would make it a type error if you didn't mark a function as override when it was overriding something in a base class. this could potentially stop you from
accidentally overriding something in a base class that you didn't
know existed in the first place my only fear is this might be the second
step along the way to becoming C++ with their "public static constexpr auto const override"
that you need to slap on every definition hopefully it won't come to that 5 out of 10 next up PEP 689 introducing
the Unstable C API tier. lots about the C API in this release, huh? this pep introduces the unstable API tier, which is exposed alongside the public API but does not require deprecation warnings before changing, so it contains things that are basically
expected to change with every version of python this would be useful for something like a debugger that might use specific op codes that are
only valid in specific versions of python. currently everything in the unstable
tier is related to code objects so unless you're using the C API this probably
won't affect you but good to be aware 6 out of 10 and the most painful change not
even a pep for this one setuptools is no longer included by default in a virtualenv this change is undoubtedly going
to break some people's builds setuptools has been the default and
de-facto standard for building and packaging your python code for a long time now and many people generally expect it to be there with that said setuptools is no
longer the only build system for Python and it doesn't make sense to pre-install it pip still works even if setuptools is
not installed but it's still painful to break someone's build so
I'll give it a 1 out of 10 but ultimately I do agree with the pip
team that this needed to be done eventually there's also a ton of tiny improvements to many
libraries but I'm not going through all those let me just mention my favorite
gem-- itertools.batched, 10 out of 10 I write batched over and over again it's in like 80% of my codebases glad to
see it included in the standard library all it does is split an iterable into
batches of a fixed size, in this case 3, so I get 1 2 3, 4 5 6, except the last batch might be shorter.
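for example:

    from itertools import batched

    for batch in batched([1, 2, 3, 4, 5, 6, 7], 3):
        print(batch)
    # (1, 2, 3)
    # (4, 5, 6)
    # (7,)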
and finally, saving the best for last: the absolute
most awesome part of python 3.12 being released is that python 3.7 is now past its end of life and
I can drop support for it in all of my code bases and python 3.8 has a year to go so I can
start pushing clients to upgrade to 3.9 again I'm James Murphy don't forget to check
out my consulting services at mcoding.io thanks to my patrons and donors see you in the next one