    finally run
    Traceback (most recent call last):
      File "except-finally.py", line ..., in <module>
        func()
      File "except-finally.py", line ..., in raise2
        def raise2(): raise SyntaxError
    SyntaxError: None

As we saw earlier, as of Python 2.5, except and finally clauses can be mixed in the same try statement. This, along with multiple except clause support, makes some of the syntactic nesting described in this section unnecessary, though the equivalent runtime nesting is common in larger Python programs. Moreover, syntactic nesting still works today, may still appear in code written prior to Python 2.5 that you may encounter, can make the disjoint roles of except and finally more explicit, and can be used as a technique for implementing alternative exception-handling behaviors in general.

Exception Idioms

We've seen the mechanics behind exceptions. Now let's take a look at some of the other ways they are typically used.

Breaking Out of Multiple Nested Loops: "go to"

As mentioned at the start of this part of the book, exceptions can often be used to serve the same roles as other languages' "go to" statements to implement more arbitrary control transfers. Exceptions, however, provide a more structured option that localizes the jump to a specific block of nested code. In this role, raise is like "go to," and except clauses and exception names take the place of program labels. You can jump only out of code wrapped in a try this way, but that's a crucial feature: truly arbitrary "go to" statements can make code extraordinarily difficult to understand and maintain.

For example, Python's break statement exits just the single closest enclosing loop, but we can always use exceptions to break out of more than one loop level if needed:

    class Exitloop(Exception): pass

    try:
        while True:
            while True:
                for i in range(10):
                    if i > 3: raise Exitloop       # break exits just one level
                    print('loop3: %s' % i)
                print('loop2')
            print('loop1')
    except Exitloop:
        print('continuing')                        # or just pass, to move on
    loop3: 0
    loop3: 1
    loop3: 2
    loop3: 3
    continuing

If you change the raise in this to break, you'll get an infinite loop, because you'll break only out of the most deeply nested for loop, and wind up in the second-level loop nesting. The code would then print "loop2" and start the for again.

Also notice that the variable i is still what it was after the try statement exits. Variable assignments made in a try are not undone in general, though as we've seen, exception instance variables listed in except clause headers are localized to that clause, and the local variables of any functions that are exited as a result of a raise are discarded. Technically, active functions' local variables are popped off the call stack, and the objects they reference may be garbage-collected as a result, but this is an automatic step.

Exceptions Aren't Always Errors

In Python, all errors are exceptions, but not all exceptions are errors. For instance, we saw earlier that file object read methods return an empty string at the end of a file. In contrast, the built-in input function, which we first met earlier in the book, deployed in an interactive loop, and learned is named raw_input in 2.X, reads a line of text from the standard input stream, sys.stdin, at each call and raises the built-in EOFError at end-of-file.

Unlike file methods, this function does not return an empty string; an empty string from input means an empty line. Despite its name, though, the EOFError exception is just a signal in this context, not an error. Because of this behavior, unless the end-of-file should terminate a script, input often appears wrapped in a try handler and nested in a loop, as in the following code:

    while True:
        try:
            line = input()           # read line from stdin (raw_input in 2.X)
        except EOFError:
            break                    # exit loop at end-of-file
        else:
            ...process next line here...

Several other built-in exceptions are similarly signals, not errors. For example, calling sys.exit() and pressing Ctrl-C on your keyboard raise SystemExit and KeyboardInterrupt, respectively. Python also has a set of built-in exceptions that represent warnings rather than errors; some of these are used to signal use of deprecated (phased out) language features. See the standard library manual's description of built-in exceptions for more information on warnings.
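As a brief aside beyond the book's text, the standard warnings module is the usual way such warning categories are raised in practice; the names old_api and new_api below are hypothetical, used only for illustration:

    import warnings

    def old_api():
        # signal deprecation without stopping the program;
        # whether the message is displayed depends on the active warning filters
        warnings.warn('old_api is deprecated; use new_api instead',
                      DeprecationWarning, stacklevel=2)
        return 42

    old_api()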
Functions Can Signal Conditions with raise

User-defined exceptions can also signal nonerror conditions. For instance, a search routine can be coded to raise an exception when a match is found instead of returning a status flag for the caller to interpret. In the following, the try/except/else exception handler does the work of an if/else return-value tester:

    class Found(Exception): pass

    def searcher():
        if ...success...:
            raise Found()            # raise exceptions instead of returning flags
        else:
            return

    try:
        searcher()
    except Found:                    # exception if item was found
        ...success...
    else:                            # else returned: not found
        ...failure...

More generally, such a coding structure may also be useful for any function that cannot return a sentinel value to designate success or failure. In a widely applicable function, for instance, if all objects are potentially valid return values, it's impossible for any return value to signal a failure condition. Exceptions provide a way to signal results without a return value:

    class Failure(Exception): pass

    def searcher():
        if ...success...:
            return ...founditem...
        else:
            raise Failure()

    try:
        item = searcher()
    except Failure:
        ...not found...
    else:
        ...use item here...

Because Python is dynamically typed and polymorphic to the core, exceptions, rather than sentinel return values, are the generally preferred way to signal such conditions.
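To make the abstract pattern above concrete, here is a minimal runnable sketch of the same idiom; the data and target values are made up for illustration and are not from the original text:

    class Found(Exception): pass

    def search(sequence, target):
        for position, item in enumerate(sequence):
            if item == target:
                raise Found(position)            # signal success with the match index
        # falling off the end means "not found"

    try:
        search(['spam', 'eggs', 'ham'], 'eggs')
    except Found as match:
        print('found at index', match.args[0])   # prints: found at index 1
    else:
        print('not found')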
Closing Files and Server Connections

We encountered examples in this category earlier in the book. As a summary, though, exception processing tools are also commonly used to ensure that system resources are finalized, regardless of whether an error occurs during processing or not.

For example, some servers require connections to be closed in order to terminate a session. Similarly, output files may require close calls to flush their buffers to disk for waiting consumers, and input files may consume file descriptors if not closed; although file objects are automatically closed when garbage-collected if still open, in some Pythons it may be difficult to be sure when that will occur.

As we saw earlier, the most general and explicit way to guarantee termination actions for a specific block of code is the try/finally statement:

    myfile = open(r'C:\code\textdata', 'w')
    try:
        ...process myfile...
    finally:
        myfile.close()

As we also saw, some objects make this potentially easier in Python 2.6, 3.0, and later by providing context managers that terminate or close the objects for us automatically when run by the with/as statement:

    with open(r'C:\code\textdata', 'w') as myfile:
        ...process myfile...

So which option is better here? As usual, it depends on your programs. Compared to the traditional try/finally, context managers are more implicit, which runs contrary to Python's general design philosophy. Context managers are also arguably less general: they are available only for select objects, and writing user-defined context managers to handle general termination requirements is more complex than coding a try/finally.

On the other hand, using existing context managers requires less code than using try/finally, as shown by the preceding examples. Moreover, the context manager protocol supports entry actions in addition to exit actions. In fact, it can save a line of code when no exceptions are expected at all (albeit at the expense of further nesting and indenting file processing logic):

    myfile = open(filename, 'w')          # traditional form
    ...process myfile...
    myfile.close()

    with open(filename) as myfile:        # context manager form
        ...process myfile...

Still, the implicit exception processing of with makes it more directly comparable to the explicit exception handling of try/finally. Although try/finally is the more widely applicable technique, context managers may be preferable where they are already available, or where their extra complexity is warranted.
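As a rough illustration of the user-defined context managers mentioned above, the standard library's contextlib module can wrap a try/finally inside a generator function. This is only a sketch of one common approach, not the book's own example; the opened name and 'textdata' filename are assumptions:

    from contextlib import contextmanager

    @contextmanager
    def opened(path):
        f = open(path, 'w')          # entry action
        try:
            yield f                  # value bound by "with ... as"
        finally:
            f.close()                # exit action, run even if an exception occurs

    with opened('textdata') as myfile:
        myfile.write('spam\n')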
Debugging with Outer try Statements

You can also make use of exception handlers to replace Python's default top-level exception-handling behavior. By wrapping an entire program (or a call to it) in an outer try in your top-level code, you can catch any exception that may occur while your program runs, thereby subverting the default program termination.

In the following, the empty except clause catches any uncaught exception raised while the program runs. To get hold of the actual exception that occurred in this mode, fetch the sys.exc_info function call result from the built-in sys module; it returns a tuple whose first two items contain the current exception's class and the instance object raised (more on sys.exc_info in a moment):

    try:
        ...run program...
    except:                          # all uncaught exceptions come here
        import sys
        print('uncaught!', sys.exc_info()[0], sys.exc_info()[1])

This structure is commonly used during development, to keep programs active even after errors occur; within a loop, it allows you to run additional tests without having to restart. It's also used when testing other program code, as described in the next section.

On a related note, for more about handling program shutdowns without recovery from them, see also Python's atexit standard library module. It's also possible to customize what the top-level exception handler does with sys.excepthook. These and other related tools are described in Python's library manual.

Running In-Process Tests

Some of the coding patterns we've just looked at can be combined in a test-driver application that tests other code within the same process. The following partial code sketches the general model:

    import sys
    log = open('testlog', 'a')
    from testapi import moretests, runnexttest, testname

    def testdriver():
        while moretests():
            try:
                runnexttest()
            except:
                print('FAILED', testname(), sys.exc_info()[:2], file=log)
            else:
                print('PASSED', testname(), file=log)

    testdriver()
The testapi module is left abstract in this example. Because an uncaught exception in a test case would normally kill this test driver, you need to wrap test case calls in a try if you want to continue the testing process after a test fails. The empty except catches any uncaught exception generated by a test case as usual, and it uses sys.exc_info to log the exception to a file; the else clause is run when no exception occurs, the test success case.

Such boilerplate code is typical of systems that test functions, modules, and classes by running them in the same process as the test driver. In practice, however, testing can be much more sophisticated than this. For instance, to test external programs, you could instead check status codes or outputs generated by program-launching tools such as os.system and os.popen, used earlier in this book and covered in the standard library manual. Such tools do not generally raise exceptions for errors in the external programs; in fact, the test cases may run in parallel with the test driver.

At the end of this chapter, we'll also briefly meet more complete testing frameworks provided by Python, such as doctest and PyUnit, which provide tools for comparing expected outputs with actual results.

More on sys.exc_info

The sys.exc_info result used in the last two sections allows an exception handler to gain access to the most recently raised exception generically. This is especially useful when using the empty except clause to catch everything blindly, to determine what was raised:

    try:
        ...
    except:
        # sys.exc_info()[0:2] are the exception class and instance
        ...

If no exception is being handled, this call returns a tuple containing three None values. Otherwise, the values returned are (type, value, traceback), where:

- type is the exception class of the exception being handled.
- value is the exception class instance that was raised.
- traceback is a traceback object that represents the call stack at the point where the exception originally occurred, and is used by the traceback module to generate error messages.
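A quick concrete illustration, added here beyond the book's text, of what those three values look like for a simple error:

    import sys

    try:
        1 / 0
    except:
        exc_class, exc_instance, exc_traceback = sys.exc_info()
        print(exc_class)                              # <class 'ZeroDivisionError'>
        print(exc_instance)                           # division by zero
        print(exc_instance.__class__ is exc_class)    # True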
As we saw in the prior chapter, sys.exc_info can also sometimes be useful to determine the specific exception type when catching exception category superclasses. As we've also learned, though, because in this case you can also get the exception type by fetching the __class__ attribute of the instance obtained with the as clause, sys.exc_info is often redundant apart from the empty except:

    try:
        ...
    except General as instance:
        # instance.__class__ is the exception class
        ...

As we've seen, using Exception for the general exception name here would catch all nonexit exceptions, similar to an empty except but less extreme, and still give access to the exception instance and its class. Even so, using the instance object's interfaces and polymorphism is often a better approach than testing exception types; exception methods can be defined per class and run generically:

    try:
        ...
    except General as instance:
        instance.method()                # does the right thing for this instance

As usual, being too specific in Python can limit your code's flexibility. A polymorphic approach like the last example here generally supports future evolution better than explicitly type-specific tests or actions.

Displaying Errors and Tracebacks

Finally, the exception traceback object available in the prior section's sys.exc_info result is also used by the standard library's traceback module to generate the standard error message and stack display manually. This module has a handful of interfaces that support wide customization, which we don't have space to cover usefully here, but the basics are simple. Consider the following aptly named file, badly.py:

    import traceback

    def inverse(x):
        return 1 / x

    try:
        inverse(0)
    except Exception:
        traceback.print_exc(file=open('badly.exc', 'w'))
    print('Bye')

This code uses the print_exc convenience function in the traceback module, which uses sys.exc_info data by default. When run, the script prints the error message to a file, which is handy in testing programs that need to catch errors but still record them in full:

    c:\code> python badly.py
    Bye

    c:\code> type badly.exc
    Traceback (most recent call last):
      File "badly.py", line ..., in <module>
        inverse(0)
      File "badly.py", line ..., in inverse
        return 1 / x
    ZeroDivisionError: division by zero
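On a related note, and as an addition beyond the book's example, traceback.format_exc returns the same formatted text as a string instead of printing it, which can be convenient when the report should go to a logger or some other destination:

    import traceback

    try:
        1 / 0
    except Exception:
        report = traceback.format_exc()      # same text print_exc would display
        print('logged %d characters' % len(report))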
For more details on these and related topics, consult other reference resources and manuals.

Version skew note: In Python 2.X, the older tools sys.exc_type and sys.exc_value still work to fetch the most recent exception type and value, but they can manage only a single, global exception for the entire process. These two names have been removed in Python 3.X. The newer and preferred sys.exc_info() call, available in both 2.X and 3.X, instead keeps track of each thread's exception information, and so is thread-specific. Of course, this distinction matters only when using multiple threads in Python programs (a subject beyond this book's scope), but 3.X forces the issue. See other resources for more details.

Exception Design Tips and Gotchas

I'm lumping design tips and gotchas together in this chapter because it turns out that the most common gotchas largely stem from design issues. By and large, exceptions are easy to use in Python. The real art behind them is in deciding how specific or general your except clauses should be and how much code to wrap up in try statements. Let's address the second of these concerns first.

What Should Be Wrapped

In principle, you could wrap every statement in your script in its own try, but that would just be silly (the try statements would then need to be wrapped in try statements!). What to wrap is really a design issue that goes beyond the language itself, and it will become more apparent with use. But for now, here are a few rules of thumb:

- Operations that commonly fail should generally be wrapped in try statements. For example, operations that interface with system state (file opens, socket calls, and the like) are prime candidates for try.
- However, there are exceptions to the prior rule: in simple scripts, you may want failures of such operations to kill your program instead of being caught and ignored. This is especially true if the failure is a showstopper. Failures in Python typically result in useful error messages (not hard crashes), and this is the best outcome some programs could hope for.
- You should implement termination actions in try/finally statements to guarantee their execution, unless a context manager is available as a with/as option. The try/finally statement form allows you to run code whether exceptions occur or not in arbitrary scenarios.
- It is sometimes more convenient to wrap the call to a large function in a single try statement, rather than littering the function itself with many try statements. That way, exceptions raised anywhere in the function percolate up to the try around the call,
  and you reduce the amount of code within the function.

The types of programs you write will probably influence the amount of exception handling you code as well. Servers, for instance, must generally keep running persistently and so will likely require try statements to catch and recover from exceptions. In-process testing programs of the kind we saw in this chapter will probably handle exceptions as well. Simpler one-shot scripts, though, will often ignore exception handling completely because failure at any step requires script shutdown.

Catching Too Much: Avoid Empty except and Exception

As mentioned, exception handler generality is a key design choice. Python lets you pick and choose which exceptions to catch, but you sometimes have to be careful to not be too inclusive. For example, you've seen that an empty except clause catches every exception that might be raised while the code in the try block runs.

That's easy to code, and sometimes desirable, but you may also wind up intercepting an error that's expected by a try handler higher up in the exception nesting structure. For example, an exception handler such as the following catches and stops every exception that reaches it, regardless of whether another handler is waiting for it:

    def func():
        try:
            ...                      # IndexError is raised in here
        except:
            ...                      # but everything comes here and dies!

    try:
        func()
    except IndexError:               # exception should be processed here
        ...

Perhaps worse, such code might also catch unrelated system exceptions. Even things like memory errors, genuine programming mistakes, iteration stops, keyboard interrupts, and system exits raise exceptions in Python. Unless you're writing a debugger or similar tool, such exceptions should not usually be intercepted in your code.

For example, scripts normally exit when control falls off the end of the top-level file. However, Python also provides a built-in sys.exit(statuscode) call to allow early terminations. This actually works by raising a built-in SystemExit exception to end the program, so that try/finally handlers run on the way out and special types of programs can intercept the event. Because of this, a try with an empty except might unknowingly prevent a crucial exit, as in the following file, exiter.py. (A related call, os._exit, also ends a program, but via an immediate termination: it skips cleanup actions, including any registered with the atexit module noted earlier, and cannot be intercepted with try/except or try/finally blocks. It is usually used only in spawned child processes, a topic beyond this book's scope; see the library manual or follow-up texts for details.)
    import sys

    def bye():
        sys.exit(40)                 # crucial error: abort now!

    try:
        bye()
    except:
        print('got it')              # oops--we ignored the exit
    print('continuing...')

    c:\code> python exiter.py
    got it
    continuing...

You simply might not expect all the kinds of exceptions that could occur during an operation. Using the built-in exception classes of the prior chapter can help in this particular case, because the Exception superclass is not a superclass of SystemExit:

    try:
        bye()
    except Exception:                # won't catch exits, but _will_ catch many others
        ...

In other cases, though, this scheme is no better than an empty except clause; because Exception is a superclass above all built-in exceptions except system-exit events, it still has the potential to catch exceptions meant for elsewhere in the program.

Probably worst of all, both using an empty except and catching the Exception superclass will also catch genuine programming errors, which should be allowed to pass most of the time. In fact, these two techniques can effectively turn off Python's error-reporting machinery, making it difficult to notice mistakes in your code. Consider this code, for example:

    mydictionary = {...}
    ...
    try:
        x = myditctionary['spam']    # oops: misspelled
    except:
        x = None                     # assume we got KeyError
    ...continue here with x...

The coder here assumes that the only sort of error that can happen when indexing a dictionary is a missing key error. But because the name myditctionary is misspelled (it should say mydictionary), Python raises a NameError instead for the undefined name reference, which the handler will silently catch and ignore. The event handler will incorrectly fill in a None default for the dictionary access, masking the program error.

Moreover, catching Exception here will not help; it would have the exact same effect as an empty except, happily and silently filling in a default and masking a genuine program error you will probably want to know about. If this happens in code that is far removed from the place where the fetched values are used, it might make for a very interesting debugging task!
As a rule of thumb, be as specific in your handlers as you can be; empty except clauses and Exception catchers are handy, but potentially error-prone. In the last example, for instance, you would be better off saying except KeyError: to make your intentions explicit and avoid intercepting unrelated events. In simpler scripts, the potential for problems might not be significant enough to outweigh the convenience of a catchall, but in general, general handlers are generally trouble.

Catching Too Little: Use Class-Based Categories

On the other hand, neither should handlers be too specific. When you list specific exceptions in a try, you catch only what you actually list. This isn't necessarily a bad thing, but if a system evolves to raise other exceptions in the future, you may need to go back and add them to exception lists elsewhere in your code.

We saw this phenomenon at work in the prior chapter. For instance, the following handler is written to treat MyExcept1 and MyExcept2 as normal cases and everything else as an error. If you add a MyExcept3 in the future, though, it will be processed as an error unless you update the exception list:

    try:
        ...
    except (MyExcept1, MyExcept2):   # breaks if you add a MyExcept3 later
        ...                          # nonerrors
    else:
        ...                          # assumed to be an error

Luckily, careful use of the class-based exceptions we discussed earlier can make this code maintenance trap go away completely. As we saw, if you catch a general superclass, you can add and raise more specific subclasses in the future without having to extend except clause lists manually; the superclass becomes an extendible exceptions category:

    try:
        ...
    except SuccessCategoryName:      # OK if you add a MyExcept3 subclass later
        ...                          # nonerrors
    else:
        ...                          # assumed to be an error

In other words, a little design goes a long way. The moral of the story is to be careful to be neither too general nor too specific in exception handlers, and to pick the granularity of your try statement wrappings wisely. Especially in larger systems, exception policies should be part of the overall design.

Core Language Summary

Congratulations! This concludes your look at the fundamentals of the Python programming language. If you've gotten this far, you've become a fully operational Python programmer. The next part of this book covers optional, advanced topics that I'll
  describe in a moment. In terms of the essentials, though, the Python story, and this book's main journey, is now complete.

Along the way, you've seen just about everything there is to see in the language itself, and in enough depth to apply to most of the code you are likely to encounter in the open source "wild." You've studied built-in types, statements, and exceptions, as well as tools used to build up the larger program units of functions, modules, and classes. You've also explored important software design issues, the complete OOP paradigm, functional programming tools, program architecture concepts, alternative tool tradeoffs, and more, compiling a skill set now qualified to be turned loose on the task of developing real applications.

The Python Toolset

From this point forward, your future Python career will largely consist of becoming proficient with the toolset available for application-level Python programming. You'll find this to be an ongoing task. The standard library, for example, contains hundreds of modules, and the public domain offers still more tools. It's possible to spend decades seeking proficiency with all these tools, especially as new ones are constantly appearing to address new technologies (trust me on this; I've been at it for decades and counting!).

Speaking generally, Python provides a hierarchy of toolsets:

- Built-ins. Built-in types like strings, lists, and dictionaries make it easy to write simple programs fast.
- Python extensions. For more demanding tasks, you can extend Python by writing your own functions, modules, and classes.
- Compiled extensions. Although we don't cover this topic in this book, Python can also be extended with modules written in an external language like C or C++.

Because Python layers its toolsets, you can decide how deeply your programs need to delve into this hierarchy for any given task: you can use built-ins for simple scripts, add Python-coded extensions for larger systems, and code compiled extensions for advanced work. We've only covered the first two of these categories in this book, and that's plenty to get you started doing substantial programming in Python.

Beyond this, there are tools, resources, or precedents for using Python in nearly any computer domain you can imagine. For pointers on where to go next, see the earlier overview of Python applications and users. You'll likely find that with a powerful open source language like Python, common tasks are often much easier, and even more enjoyable, than you might expect.
Development Tools for Larger Projects

Most of the examples in this book have been fairly small and self-contained. They were written that way on purpose, to help you master the basics. But now that you know all about the core language, it's time to start learning how to use Python's built-in and third-party interfaces to do real work.

In practice, Python programs can become substantially larger than the examples you've experimented with so far in this book. Even in Python, thousands of lines of code are not uncommon for nontrivial and useful programs, once you add up all the individual modules in the system. Though Python's basic program structuring tools such as modules and classes help much to manage this complexity, other tools can sometimes offer additional support for developing larger systems.

You've seen some of these in action, and I've mentioned a few others. To help you on your next steps, here is a quick tour and summary of some of the most commonly used tools in this domain:

PyDoc and docstrings. PyDoc's help function and HTML interfaces were introduced earlier in the book. PyDoc provides a documentation system for your modules and objects, integrates with Python's docstrings syntax, and is a standard part of the Python system. See the earlier coverage for more documentation source hints.

PyChecker and PyLint. Because Python is such a dynamic language, some programming errors are not reported until your program runs (even syntax errors are not caught until a file is run or imported). This isn't a big drawback; as with most languages, it just means that you have to test your Python code before shipping it. At worst, with Python you essentially trade a compile phase for an initial testing phase. Furthermore, Python's dynamic nature, automatic error messages, and exception model make it easier and quicker to find and fix errors than it is in some other languages (unlike C, for example, Python does not crash completely on errors).

Still, tools can help here too. The PyChecker and PyLint systems provide support for catching common errors ahead of time, before your script runs. They serve similar roles to the lint program in C development. Some Python developers run their code through PyChecker prior to testing or delivery, to catch any lurking potential problems. In fact, it's not a bad idea to try this when you're first starting out; some of these tools' warnings may help you learn to spot and avoid common Python mistakes. PyChecker and PyLint are third-party open source packages, available at the PyPI website or your friendly neighborhood web search engine; they may appear in IDE GUIs as well.

PyUnit (a.k.a. unittest). Earlier, we learned how to add self-test code to a Python file by using the __name__ == '__main__' trick at the bottom of the file, a simple unit-testing protocol. Python also ships with more complete testing
  tools. The first, PyUnit (called unittest in the library manual), provides an object-oriented class framework for specifying and customizing test cases and expected results. It mimics the JUnit framework for Java. This is a sophisticated class-based unit testing system; see the Python library manual for details.

doctest. The doctest standard library module provides a second and simpler approach to regression testing, based upon Python's docstrings feature. Roughly, to use doctest, you cut and paste a log of an interactive testing session into the docstrings of your source files. doctest then extracts your docstrings, parses out the test cases and results, and reruns the tests to verify the expected results. doctest's operation can be tailored in a variety of ways; see the library manual for more details.

IDEs. We discussed IDEs for Python earlier in the book. IDEs such as IDLE provide a graphical environment for editing, running, debugging, and browsing your Python programs. Some advanced IDEs, such as Eclipse, Komodo, NetBeans, and others, may support additional development tasks, including source control integration, code refactoring, project management tools, and more. See the text editors page on the Python website, or a web search engine, for more on available IDEs and GUI builders for Python.

Profilers. Because Python is so high-level and dynamic, intuitions about performance gleaned from experience with other languages usually don't apply to Python code. To truly isolate performance bottlenecks in your code, you need to add timing logic with clock tools in the time or timeit modules, or run your code under the profile module. We saw an example of the timing modules at work when comparing the speed of iteration tools and Pythons earlier in the book.

Profiling is usually your first optimization step: code for clarity, then profile to isolate bottlenecks, and then time alternative codings of the slow parts of your program. For the second of these steps, profile is a standard library module that implements a source code profiler for Python. It runs a string of code you provide (e.g., a script file import, or a call to a function), and then, by default, prints a report to the standard output stream that gives performance statistics: number of calls to each function, time spent in each function, and more.

The profile module can be run as a script or imported, and it may be customized in various ways; for example, it can save run statistics to a file to be analyzed later with the pstats module. To profile interactively, import the profile module and call profile.run('code'), passing in the code you wish to profile as a string (e.g., a call to a function, an import of a file, or code read from a file). To profile from a system shell command line, use a command of the form python -m profile main.py args (see the appendixes for more on this format). Also see Python's standard library manuals for other profiling options; the cProfile module, for example, has an identical interface to profile but runs with less overhead, and so may be better suited to profiling long-running programs.
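As a rough, hypothetical illustration of the interactive usage just described (the work function and the 'profstats' filename are made up for the example, not taken from the book):

    import cProfile, pstats

    def work():
        return sum(i * i for i in range(100000))

    cProfile.run('work()', 'profstats')                 # run the code string, save stats to a file
    stats = pstats.Stats('profstats')
    stats.sort_stats('cumulative').print_stats(5)       # report the top five entries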
Debuggers. We also discussed debugging options earlier in the book (see the sidebar "Debugging Python Code"). As a review, most development IDEs for Python support GUI-based debugging, and the Python standard library also includes a source code debugger module called pdb. This module provides a command-line interface and works much like common C language debuggers (e.g., dbx, gdb).

Much like the profiler, the pdb debugger can be run either interactively or from a command line, and can be imported and called from a Python program. To use it interactively, import the module, start running code by calling a pdb function (e.g., pdb.run('main()')), and then type debugging commands from pdb's interactive prompt. To launch pdb from a system shell command line, use a command of the form python -m pdb main.py args. pdb also includes a useful postmortem analysis call, pdb.pm(), which starts the debugger after an exception has been encountered, possibly in conjunction with Python's -i flag. See the appendixes for more on these tools.

Because IDEs such as IDLE also include point-and-click debugging interfaces, pdb isn't as critical a tool today, except when a GUI isn't available or when more control is desired. See the earlier chapter for tips on using IDLE's debugging GUI interfaces. Really, neither pdb nor IDEs seem to be used much in practice; as noted earlier, most programmers either insert print statements or simply read Python's error messages. Perhaps not the most high-tech of approaches, but the practical tends to win the day in the Python world!

Shipping options. Earlier, we introduced common tools for packaging Python programs: py2exe, PyInstaller, and others that can package byte code and the Python Virtual Machine into "frozen binary" standalone executables, which don't require that Python be installed on the target machine and hide your system's code. In addition, we learned that Python programs may be shipped in their source (.py) or byte code (.pyc) forms, and that import hooks support special packaging techniques such as automatic extraction of .zip files and byte code encryption. We also briefly met the standard library's distutils modules, which provide packaging options for Python modules and packages, and C-coded extensions; see the Python manuals for more details. The emerging Python "eggs" third-party packaging system provides another alternative that also accounts for dependencies; search the web for more details.

Optimization options. When speed counts, there are a handful of options for optimizing your programs. The PyPy system described earlier provides a just-in-time compiler for translating Python byte code to binary machine code, and Shed Skin offers a Python-to-C++ translator.
Optimized byte code, generated with Python's -O command-line flag and meant to be deployed in place of normal byte code, is another option; because this provides a very modest performance boost, however, it is not commonly used except to remove debugging code. As a last resort, you can also move parts of your program to a compiled language such as C to boost performance; see the book Programming Python and the Python standard manuals for more on C extensions. In general, Python's speed tends to also improve over time, so upgrading to later releases may improve speed too, once you verify that they are faster for your code, that is (though largely repaired since, Python 3.0's initial release was dramatically slower than 2.X on some I/O operations!).

Other hints for larger projects. We've met a variety of core language features in this text that will also tend to become more useful once you start coding larger projects. These include module packages, class-based exceptions, class pseudoprivate attributes, documentation strings, module path configuration files, hiding names from "from *" with __all__ lists and _X-style names, adding self-test code with the __name__ == '__main__' trick, using common design rules for functions and modules, using object-oriented design patterns, and so on.

To learn about other large-scale Python development tools available in the public domain, be sure to browse the pages at the PyPI website and the web at large. Applying Python is actually a larger topic than learning Python, and one we'll have to delegate to follow-up resources here.

Chapter Summary

This chapter wrapped up the exceptions part of the book with a survey of design concepts, a look at common exception use cases, and a brief summary of commonly used development tools. This chapter also wrapped up the core material of this book. At this point, you've been exposed to the full subset of Python that most programmers use, and probably more. In fact, if you have read this far, you should feel free to consider yourself an official Python programmer. Be sure to pick up a T-shirt or laptop sticker the next time you're online (and don't forget to add Python to your resume the next time you dig it out).

The next and final part of this book is a collection of chapters dealing with topics that are advanced, but still in the core language category. These are all optional reading, or at least deferrable reading, because not every Python programmer must delve into their subjects, and others can postpone these topics until they are needed.
Some readers may even find them more esoteric than advanced language features. On the other hand, if you do need to care about things like Unicode or binary data, have to deal with API-building tools such as descriptors, decorators, and metaclasses, or just want to dig a bit further in general, the next part of the book will help you get started. The larger examples in the final part will also give you a chance to see the concepts you've already learned being applied in more realistic ways.

As this is the end of the core material of this book, though, you get a break on the chapter quiz: just one question this time. As always, be sure to work through this part's closing exercises to cement what you've learned in the past few chapters. Because the next part is optional reading, this is the final end-of-part exercises session. If you want to see some examples of how what you've learned comes together in real scripts drawn from common applications, be sure to check out the "solution" to the final exercise in Appendix D. And if this is the end of your journey in this book, be sure to also see the "bonus" section at the end of the very last chapter in this book. (For the sake of readers continuing on to the advanced topics part, I won't spill the beans here!)

Test Your Knowledge: Quiz

1. (This question is a repeat from the first quiz in the book; see, I told you it would be easy! :-) Why does "spam" show up in so many Python examples in books and on the web?

Test Your Knowledge: Answers

1. Because Python is named after the British comedy group Monty Python (based on surveys I've conducted in classes, this is a much-too-well-kept secret in the Python world!). The spam reference comes from a Monty Python skit, set in a cafeteria whose menu items all seem to come with Spam. A couple trying to order food there keeps getting drowned out by a chorus of Vikings singing a song about Spam. No, really. And if I could insert an audio clip of that song here, I would.

Test Your Knowledge: Part VII Exercises

As we've reached the end of this part of the book, it's time for a few exception exercises to give you a chance to practice the basics. Exceptions really are simple tools; if you get these, you've probably mastered the exceptions domain. See Part VII in Appendix D for the solutions.
1. try/except. Write a function called oops that explicitly raises an IndexError exception when called. Then write another function that calls oops inside a try/except statement to catch the error. What happens if you change oops to raise a KeyError instead of an IndexError? Where do the names KeyError and IndexError come from? (Hint: recall that all unqualified names generally come from one of four scopes.)

2. Exception objects and lists. Change the oops function you just wrote to raise an exception you define yourself, called MyError. Identify your exception with a class (unless you're using a very old version of Python, you must). Then, extend the try statement in the catcher function to catch this exception and its instance in addition to IndexError, and print the instance you catch.

3. Error handling. Write a function called safe(func, *pargs, **kargs) that runs any function with any number of positional and/or keyword arguments by using the arbitrary-arguments header and call syntax, catches any exception raised while the function runs, and prints the exception using the exc_info call in the sys module. Then use your safe function to run your oops function from exercise 1 or 2. Put safe in a module file called exctools.py, and pass it the oops function interactively. What kind of error messages do you get? Finally, expand safe to also print a Python stack trace when an error occurs by calling the built-in print_exc function in the standard traceback module; see earlier in this chapter and consult the Python library reference manual for usage details. We could probably code safe as a function decorator, but we'll have to move on to the next part of the book to learn fully how (see the solutions for a preview).

4. Self-study examples. At the end of Appendix D, I've included a handful of example scripts developed as group exercises in live Python classes for you to study and run on your own in conjunction with Python's standard manual set. These are not described, and they use tools in the Python standard library that you'll have to research on your own. Still, for many readers, it helps to see how the concepts we've discussed in this book come together in real programs. If these whet your appetite for more, you can find a wealth of larger and more realistic application-level Python program examples in follow-up books like Programming Python and on the web.
Advanced Topics
Unicode and Byte Strings

So far, our exploration of strings in this book has been deliberately incomplete. The types preview earlier in the book briefly introduced Python's Unicode strings and files without giving many details, and the strings chapter in the core types part of this book deliberately limited its scope to the subset of string topics that most Python programmers need to know about.

This was by design: because many programmers, including most beginners, deal with simple forms of text like ASCII, they can happily work with Python's basic str string type and its associated operations and don't need to come to grips with more advanced string concepts. In fact, such programmers can often ignore the string changes in Python 3.X and continue to use strings as they may have in the past.

On the other hand, many other programmers deal with more specialized types of data: non-ASCII character sets, image file contents, and so on. For those programmers, and others who may someday join them, in this chapter we're going to fill in the rest of the Python string story and look at some more advanced concepts in Python's string model.

Specifically, we'll explore the basics of Python's support for Unicode text (rich character strings used in internationalized applications), as well as binary data (strings that represent absolute byte values). As we'll see, the advanced string representation story has diverged in recent versions of Python:

- Python 3.X provides an alternative string type for binary data, and supports Unicode text (including ASCII) in its normal string type.
- Python 2.X provides an alternative string type for non-ASCII Unicode text, and supports both simple text and binary data in its normal string type.

In addition, because Python's string model has a direct impact on how you process non-ASCII files, we'll explore the fundamentals of that related topic here as well. Finally, we'll take a brief look at some advanced string and binary tools, such as pattern matching, object pickling, binary data packing, and XML parsing, and the ways in which they are impacted by 3.X's string changes.
This chapter is officially an advanced topics chapter, because not every programmer needs to delve into the worlds of Unicode encodings or binary data. For some readers, the earlier preview may suffice, and others may wish to file this away for future reference. If you ever need to care about processing either of these, though, you'll find that Python's string models provide the support you need.

String Changes in 3.X

One of the most noticeable changes in the Python 3.X line is the mutation of string object types. In a nutshell, 2.X's str and unicode types have morphed into 3.X's bytes and str types, and a new mutable bytearray type has been added. The bytearray type is technically available in Python 2.6 and 2.7 too (though not earlier), but it's a back-port from 3.X and does not as clearly distinguish between text and binary content in 2.X.

Especially if you process data that is either Unicode or binary in nature, these changes can have substantial impacts on your code. As a general rule of thumb, how much you need to care about this topic depends in large part upon which of the following categories you fall into:

- If you deal with non-ASCII Unicode text, for instance in the context of internationalized domains like the web, or the results of some XML and JSON parsers and databases, you will find support for text encodings to be different in 3.X, but also probably more direct, accessible, and seamless than in 2.X.
- If you deal with binary data, for example in the form of image or audio files or packed data processed with the struct module, you will need to understand 3.X's new bytes object and 3.X's different and sharper distinction between text and binary data and files.
- If you fall into neither of the prior two categories, you can generally use strings in 3.X much as you would in 2.X, with the general str string type, text files, and all the familiar string operations we studied earlier. Your strings will be encoded and decoded by using your platform's default encoding (ASCII, or UTF-8 on Windows in some contexts; sys.getdefaultencoding gives your default if you care to check), but you probably won't notice.

In other words, if your text is always ASCII, you can get by with normal string objects and text files and can avoid most of the following story for now. As we'll see in a moment, ASCII is a simple kind of Unicode and a subset of other encodings, so string operations and files generally "just work" if your programs process only ASCII text.

Even if you fall into the last of the three categories just mentioned, though, a basic understanding of Unicode and 3.X's string model can help both to demystify some of the underlying behavior now, and to make mastering Unicode or binary data issues easier if they impact you later.
Given Unicode's prevalence in files, directories, network interfaces, databases, pipes, JSON, XML, and even GUIs, Unicode may no longer be an optional topic for you in Python 3.X work. Python's support for Unicode and binary data is also available in 2.X, albeit in different forms. Although our main focus in this chapter is on string types in 3.X, we'll also explore how 2.X's equivalent support differs along the way, for readers using 2.X. Regardless of which version you use, the tools we'll explore here can become important in many types of programs.

String Basics

Before we look at any code, let's begin with a general overview of Python's string model. To understand why 3.X changed the way it did on this front, we have to start with a brief look at how characters are actually represented in computers, both when encoded in files and when stored in memory.

Character Encoding Schemes

Most programmers think of strings as series of characters used to represent textual data. While that's accurate, the way characters are stored can vary, depending on what sort of character set must be recorded. When text is stored on files, for example, its character set determines its format.

Character sets are standards that assign integer codes to individual characters so they can be represented in computer memory. The ASCII standard, for example, was created in the U.S., and it defines many programmers' notion of text strings. ASCII defines character codes from 0 through 127 and allows each character to be stored in one 8-bit byte, only 7 bits of which are actually used.

For example, the ASCII standard maps the character 'a' to the integer value 97 (0x61 in hex), which can be stored in a single byte in memory and files. If you wish to see how this works, Python's ord built-in function gives the binary identifying value for a character, and chr returns the character for a given integer code value:

    >>> ord('a')             # 'a' is a byte with binary value 97 in ASCII (and others)
    97
    >>> hex(97)
    '0x61'
    >>> chr(97)              # binary value 97 stands for the character 'a'
    'a'

Sometimes one byte per character isn't enough, though. Various symbols and accented characters, for instance, do not fit into the range of possible characters defined by ASCII. To accommodate special characters, some standards use all the possible values in an 8-bit byte and assign the values 128
  through 255 (outside ASCII's range) to special characters. One such standard, known as the Latin-1 character set, is widely used in Western Europe. In Latin-1, character codes above 127 are assigned to accented and otherwise special characters. The character assigned to byte value 196, for example, is a specially marked non-ASCII character:

    >>> 0xC4
    196
    >>> chr(196)             # Python 3.X result form shown
    'Ä'

This standard allows for a wide array of extra special characters, but still supports ASCII as a 7-bit subset of its 8-bit representation.

Still, some alphabets define so many characters that it is impossible to represent each of them as one byte. Unicode allows more flexibility. Unicode text is sometimes referred to as "wide-character" strings, because characters may be represented with multiple bytes if needed. Unicode is typically used in internationalized programs, to represent European, Asian, and other non-English character sets that have more characters than 8-bit bytes can represent.

To store such rich text in computer memory, we say that characters are translated to and from raw bytes using an encoding: the rules for translating a string of Unicode characters to a sequence of bytes, and extracting a string from a sequence of bytes. More procedurally, this translation back and forth between bytes and strings is defined by two terms:

- Encoding is the process of translating a string of characters into its raw bytes form, according to a desired encoding name.
- Decoding is the process of translating a raw string of bytes into its character string form, according to its encoding name.

That is, we encode from string to raw bytes, and decode from raw bytes to string. To scripts, decoded strings are just characters in memory, but may be encoded into a variety of byte string representations when stored on files, transferred over networks, embedded in documents and databases, and so on.

For some encodings, the translation process is trivial: ASCII and Latin-1, for instance, map each character to a fixed-size single byte, so no translation work is required. For other encodings, the mapping can be more complex and yield multiple bytes per character, even for simple 8-bit forms of text.

The widely used UTF-8 encoding, for example, allows a wide range of characters to be represented by employing a variable-sized number of bytes scheme. Character codes less than 128 are represented as a single byte; codes between 128 and 0x7ff (2047) are turned into two bytes, where each byte has a value between 128 and 255; and codes above 0x7ff are turned into three- or four-byte sequences having values between 128 and 255. This keeps simple ASCII text compact, and avoids null (zero value) bytes that can cause problems for C libraries and networking.
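As a quick round trip added here for illustration (the sample string is made up), the str.encode and bytes.decode methods apply these two translation steps directly; note how the non-ASCII character becomes two bytes under UTF-8:

    >>> text = 'spÄm'                    # a str of Unicode characters
    >>> raw = text.encode('utf-8')       # encode: characters -> raw bytes
    >>> raw
    b'sp\xc3\x84m'
    >>> len(text), len(raw)              # four characters, five bytes
    (4, 5)
    >>> raw.decode('utf-8')              # decode: raw bytes -> characters
    'spÄm'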
Because their encodings' character maps assign characters to the same codes for compatibility, ASCII is a subset of both Latin-1 and UTF-8. That is, a valid ASCII character string is also a valid Latin-1- and UTF-8-encoded string. For example, every ASCII file is a valid UTF-8 file, because the ASCII character set is a 7-bit subset of UTF-8.

Conversely, the UTF-8 encoding is binary compatible with ASCII, but only for character codes less than 128. Latin-1 and UTF-8 simply allow for additional characters: Latin-1 for characters mapped to values 128 through 255 within a byte, and UTF-8 for characters that may be represented with multiple bytes.

Other encodings allow for richer character sets in different ways. UTF-16 and UTF-32, for example, format text with a fixed-size 2 and 4 bytes per each character scheme, respectively, even for characters that could otherwise fit in a single byte. Some encodings may also insert prefixes that identify byte ordering.

To see this for yourself, run a string's encode method, which gives its encoded byte-string format under a named scheme; a two-character ASCII string is 2 bytes in ASCII, Latin-1, and UTF-8, but it's much wider in UTF-16 and UTF-32 and includes header bytes:

    >>> S = 'ni'
    >>> S.encode('ascii'), S.encode('latin1'), S.encode('utf8')
    (b'ni', b'ni', b'ni')

    >>> S.encode('utf16'), len(S.encode('utf16'))
    (b'\xff\xfen\x00i\x00', 6)

    >>> S.encode('utf32'), len(S.encode('utf32'))
    (b'\xff\xfe\x00\x00n\x00\x00\x00i\x00\x00\x00', 12)

These results differ slightly in Python 2.X (you won't get the leading b for byte strings), but all of these encoding schemes, ASCII, Latin-1, UTF-8, and many others, are considered to be Unicode.

To Python programmers, encodings are specified as strings containing the encoding's name. Python comes with a large number of encodings; see the Python library reference for a complete list. Importing the module encodings and running help(encodings) shows you many encoding names as well; some are implemented in Python, and some in C. Some encodings have multiple names, too; for example, latin-1, iso_8859_1, and other aliases all name the same encoding, Latin-1. We'll revisit encodings later in this chapter, when we study techniques for writing Unicode strings in a script.

For more on the underlying Unicode story, see the Python standard manual set. It includes a "Unicode HOWTO" in its "Python HOWTOs" section, which provides additional background that we will skip here in the interest of space.
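One way to confirm the synonym point above (an illustration added here, not from the original text) is the codecs.lookup call, which maps any registered alias to its codec's canonical name:

    >>> import codecs
    >>> codecs.lookup('latin-1').name, codecs.lookup('iso_8859_1').name
    ('iso8859-1', 'iso8859-1')
    >>> codecs.lookup('utf8').name
    'utf-8'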
How Python Stores Strings in Memory

The prior section's encodings really only apply when text is stored or transferred externally, in files and other mediums. In memory, Python always stores decoded text strings in an encoding-neutral format, which may or may not use multiple bytes for each character. All text processing occurs in this uniform internal format. Text is translated to and from an encoding-specific format only when it is transferred to or from external text files, byte strings, or APIs with specific encoding requirements. Once in memory, though, strings have no encoding. They are just the string object presented in this book.

Though irrelevant to your code, it may help some readers to make this more tangible. The way Python actually stores text in memory is prone to change over time, and in fact mutated substantially as of Python 3.3:

- Python 3.2 and earlier. Through Python 3.2, strings are stored internally in fixed-length UTF-16 (roughly, UCS-2) format with 2 bytes per character, unless Python is configured to use 4 bytes per character (UCS-4).
- Python 3.3 and later. Python 3.3 and later instead use a variable-length scheme with 1, 2, or 4 bytes per character, depending on a string's content. The size is chosen based upon the character with the largest Unicode ordinal value in the represented string. This scheme allows a space-efficient representation in common cases, but also allows for full UCS-4 on all platforms.

Python 3.3's new scheme is an optimization, especially compared to former wide Unicode builds. Per Python's documentation, memory footprint is reduced substantially depending on the text; encoding an ASCII string to UTF-8 doesn't need to encode characters anymore, because its ASCII and UTF-8 representations are the same; and operations such as repeating a single ASCII letter, getting a substring of an ASCII string, and UTF-8 and UTF-16 encoding are measurably faster. On some benchmarks, Python 3.3's overall memory usage is also significantly smaller than 3.2's, and similar to that of the less Unicode-centric 2.7.

Regardless of the storage scheme used, as noted earlier, Unicode clearly requires us to think of strings in terms of characters, instead of bytes. This may be a bigger hurdle for programmers accustomed to the simpler ASCII-only world where each character mapped to a single byte, but that idea no longer applies, in terms of both the results of text string tools and physical character sizes:

- Text tools. Today, both string content and length really correspond to Unicode code points, identifying ordinal numbers for characters. For instance, the built-in ord function now returns a character's Unicode code point ordinal, which is not necessarily an ASCII code, and which may or may not fit in a single 8-bit byte's value. Similarly, len returns the number of characters, not bytes; the string is probably larger in
  memory, and its characters may not fit in bytes anyhow.

- Text size. As we saw by example earlier, under Unicode a single character does not necessarily map directly to a single byte, either when encoded in a file or when stored in memory. Even characters in simple 7-bit ASCII text may not map to bytes; UTF-16 uses multiple bytes per character in files, and Python may allocate 1, 2, or 4 bytes per character in memory.

Thinking in terms of characters allows us to abstract away the details of external and internal storage. The key point here, though, is that encoding pertains mostly to files and transfers. Once loaded into a Python string, text in memory has no notion of an "encoding," and is simply a sequence of Unicode characters (a.k.a. code points) stored generically. In your script, that string is accessed as a Python string object, the next section's topic.

Python's String Types

At a more concrete level, the Python language provides string data types to represent character text in your scripts. The string types you will use in your scripts depend upon the version of Python you're using.

Python 2.X has a general string type for representing binary data and simple 8-bit text like ASCII, along with a specific type for representing richer Unicode text:

- str for representing 8-bit text and binary data
- unicode for representing decoded Unicode text

Python 2.X's two string types are different (unicode allows for the extra size of some Unicode characters and has extra support for encoding and decoding), but their operation sets largely overlap. The str string type in 2.X is used for text that can be represented with 8-bit bytes (including ASCII and Latin-1), as well as binary data that represents absolute byte values.

By contrast, Python 3.X comes with three string object types, one for textual data and two for binary data:

- str for representing decoded Unicode text (including ASCII)
- bytes for representing binary data (including encoded text)
- bytearray, a mutable flavor of the bytes type

As mentioned earlier, bytearray is also available in Python 2.6 and 2.7, but it's simply a back-port from 3.X with less content-specific behavior and is generally considered a 3.X type.
All three string types in 3.X support similar operation sets, but they have different roles. The main goal behind this change in 3.X was to merge the normal and Unicode string types of 2.X into a single string type that supports both simple and Unicode text: developers wanted to remove 2.X's string dichotomy and make Unicode processing more natural. Given that ASCII and other 8-bit text is really a simple kind of Unicode, this convergence seems logically sound.

To achieve this, 3.X stores text in a redefined str type, an immutable sequence of characters (not necessarily bytes), which may contain either simple text such as ASCII whose character values fit in single bytes, or richer character set text such as UTF-8 whose character values may require multiple bytes. Strings processed by your script with this type are stored generically in memory, and are encoded to and decoded from byte strings per either the platform Unicode default or an explicit encoding name. This allows scripts to translate text to different encoding schemes, both in memory and when transferring to and from files.

While 3.X's new str type does achieve the desired string/unicode merging, many programs still need to process raw binary data that is not encoded per any text format. Image and audio files, as well as packed data used to interface with devices or C programs you might process with Python's struct module, fall into this category. Because Unicode strings are decoded from bytes, they cannot be used to represent bytes.

To support processing of such truly binary data, a new string type, bytes, also was introduced: an immutable sequence of 8-bit integers representing absolute byte values, which prints as ASCII characters when possible. Though a distinct object type, bytes supports almost all the same operations that the str type does; this includes string methods, sequence operations, and even re module pattern matching, but not string formatting.

In 2.X, the general str type fills this binary data role, because its strings are just sequences of bytes; the separate unicode type handles richer text strings.

In more detail, a 3.X bytes object really is a sequence of small integers, each of which is in the range 0 through 255; indexing a bytes returns an int, slicing one returns another bytes, and running the list built-in on one returns a list of integers, not characters. When processed with operations that assume characters, though, the contents of bytes objects are assumed to be ASCII-encoded bytes (e.g., the isalpha method assumes each byte is an ASCII character code). Further, bytes objects are printed as character strings instead of integers for convenience.

While they were at it, Python developers also added a bytearray type in 3.X. bytearray is a variant of bytes that is mutable and so supports in-place changes. It supports the usual string operations that str and bytes do, as well as many of the same in-place change operations as lists (e.g., the append and extend methods, and assignment to indexes). This can be useful both for truly binary data and simple types of text. Assuming your text strings can be treated as raw 8-bit bytes (e.g., ASCII or Latin-1 text), bytearray finally adds direct in-place mutability for text data, something not possible with 3.X's immutable str or bytes.
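A brief interactive sketch, added here for illustration and not part of the original text, of the in-place changes bytearray allows:

    >>> ba = bytearray(b'spam')
    >>> ba[0] = ord('S')            # in-place index assignment: not possible for str or bytes
    >>> ba.extend(b'!!')            # list-like growth in place
    >>> ba
    bytearray(b'Spam!!')
    >>> ba.decode('ascii')          # still decodes to a str when treated as text
    'Spam!!'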
str or bytes.

Although Python 2.X and 3.X offer much the same functionality, they package it differently. In fact, the mapping from 2.X to 3.X string types is not completely direct: 2.X's str equates to both str and bytes in 3.X, and 3.X's str equates to both str and unicode in 2.X. Moreover, the mutability of 3.X's bytearray is unique among these types. In practice, though, this asymmetry is not as daunting as it might sound. It boils down to the following: in 2.X, you will use str for simple text and binary data and unicode for advanced forms of text whose character sets don't map to 8-bit bytes; in 3.X, you'll use str for any kind of text (ASCII, Latin-1, and all other kinds of Unicode) and bytes or bytearray for binary data. In practice, the choice is often made for you by the tools you use, especially in the case of file processing tools, the topic of the next section.

Text and Binary Files
File I/O (input and output) was also revamped in 3.X to reflect the str/bytes distinction and automatically support encoding Unicode text on transfers. Python now makes a sharp, platform-independent distinction between text files and binary files; in 3.X:

Text files
When a file is opened in text mode, reading its data automatically decodes its content and returns it as a str; writing takes a str and automatically encodes it before transferring it to the file. Both reads and writes translate per a platform default or a provided encoding name. Text-mode files also support universal end-of-line translation and additional encoding specification arguments. Depending on the encoding name, text files may also automatically process the byte order mark sequence at the start of a file (more on this momentarily).

Binary files
When a file is opened in binary mode by adding a b (lowercase only) to the mode-string argument in the built-in open call, reading its data does not decode it in any way but simply returns its content raw and unchanged, as a bytes object; writing similarly takes a bytes object and transfers it to the file unchanged. Binary-mode files also accept a bytearray object for the content to be written to the file.

Because the language sharply differentiates between str and bytes, you must decide whether your data is text or binary in nature and use either str or bytes objects to represent its content in your script, as appropriate. Ultimately, the mode in which you open a file will dictate which type of object your script will use to represent its content:

If you are processing image files, data transferred over networks, packed binary data whose content you must extract, or some device data streams, chances are good that you will want to deal with it using bytes and binary-mode files. You might also opt for bytearray if you wish to update the data without making copies of it in memory.
1,728
If you are dealing with textual data instead, such as program output, HTML, email content, or CSV or XML files, you'll probably want to use str and text-mode files.

Notice that the mode string argument to the built-in function open (its second argument) becomes fairly crucial in Python 3.X: its content not only specifies a file processing mode, but also implies a Python object type. By adding a b to the mode string, you specify binary mode and will receive, or must provide, a bytes object to represent the file's content when reading or writing. Without the b, your file is processed in text mode, and you'll use str objects to represent its content in your script. For example, the modes rb, wb, and rb+ imply bytes; r, w+, and rt (the default) imply str.

Text-mode files also handle the byte order marker (BOM) sequence that may appear at the start of files under some encoding schemes. In the UTF-16 and UTF-32 encodings, for example, the BOM specifies big- or little-endian format (essentially, which end of a bit string is most significant); see the leading bytes in the results of the UTF-16 and UTF-32 encoding calls we ran earlier for examples. A UTF-8 text file might also include a BOM to declare that it is UTF-8 in general. When reading and writing data using these encoding schemes, Python skips or writes the BOM according to rules we'll study later in this chapter.

In Python 2.X, the same behavior is supported, but normal files created by open are used to access bytes-based data, and Unicode files opened with the codecs.open call are used to process Unicode text data. The latter of these also encode and decode on transfer, as we'll see later in this chapter. First, let's explore Python's Unicode string model live.

Coding Basic Strings
Let's step through a few examples that demonstrate how the string types are used. One note up front: the code in this section was run with Python 3.X and applies to 3.X only. Still, basic string operations are generally portable across Python versions. Simple ASCII strings represented with the str type work the same in 2.X and 3.X (and exactly as we saw earlier in this book). Moreover, although there is no bytes type in Python 2.X (it has just the general str), it can usually run code that thinks there is: in 2.6 and 2.7 the call bytes(X) is present as a synonym for str(X), and the new literal form b'xxx' is taken to be the same as the normal string literal 'xxx'. You may still run into version skew in some isolated cases, though; the 2.X bytes call, for instance, does not require or allow the second argument (encoding name) that is required by 3.X's bytes.
1,729
python string objects originate when you call built-in function such as str or bytesread file created by calling open (described in the next section)or code literal syntax in your script for the lattera new literal formb'xxx(and equivalentlyb'xxx'is used to create bytes objects in xand you may create bytearray objects by calling the bytearray functionwith variety of possible arguments more formallyin all the current string literal forms--'xxx'"xxx"and triplequoted blocks--generate stradding or just before any of them creates bytes instead this new bbytes literal is similar in form to the rraw string used to suppress backslash escapes consider the followingrun in xc:\codec:\python \python 'spams 'eggs bytes literal make bytes object ( -bit bytes str literal makes unicode text string type( )type( ( 'spams 'eggsbytessequence of intprints as character string the bytes object is actually sequence of short integersthough it prints its content as characters whenever possibleb[ ] [ indexing returns an int for bytesstr for str ( ' ' [ :] [ :slicing makes another bytes or str object ( 'pam''ggs'list( )list( ([ ][' '' '' '' ']bytes is really -bit small ints the bytes object is also immutablejust like str (though bytearraydescribed lateris not)you cannot assign strbytesor integer to an offset of bytes object [ 'xboth are immutable typeerror'bytesobject does not support item assignment [ 'xtypeerror'strobject does not support item assignment finallynote that the bytes literal' or prefix also works for any string literal formincluding triple-quoted blocksthough you get back string of raw bytes that may or may not map to charactersbytes prefix works on singledoubletriple quotesraw ""xxxx yyyy "" '\nxxxx\nyyyy\ncoding basic strings
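Here is a cleaned-up sketch of the sort of session described above, runnable as a script under Python 3.X; the values are arbitrary.

B = b'spam'                   # bytes literal makes a bytes object (8-bit bytes)
S = 'eggs'                    # str literal makes a Unicode text string
print(type(B), type(S))       # <class 'bytes'> <class 'str'>
print(B[0], S[0])             # 115 e: indexing gives an int for bytes, a str for str
print(B[1:], S[1:])           # b'pam' ggs: slicing makes another bytes or str
print(list(B))                # [115, 112, 97, 109]: bytes is really 8-bit small ints
# Both types are immutable; either of these assignments raises a TypeError:
# B[0] = 'x'
# S[0] = 'x'
print(rb"""xxxx
yyyy""")                      # b'xxxx\nyyyy': the b prefix works on raw and triple-quoted forms too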
1,730
python ' 'xxxand 'xxxunicode string literal forms were removed in python because they were deemed redundant--normal strings are unicode in to aid both forward and backward compatibilitythoughthey are available again as of where they are treated as normal str stringsc:\codec:\python \python 'spamtype(uu 'spamu[ 'slist( [' '' '' '' ' unicode literal accepted in it is just strbut is backward compatible these literals are gone in through where you must use 'xxxinstead you should generally use 'xxxtext literals in new -only codebecause the form is superfluous howeverin and laterusing the literal form can ease the task of porting codeand boost code compatibility (for case in pointsee ' currency exampledescribed in an upcoming noteregardless of how text strings are coded in xthoughthey are all unicodeeven if they contain only ascii characters (more on writing non-ascii unicode text in the section "coding non-ascii texton page python string literals all three of the string forms of the prior section can be coded in xbut their meaning differs as mentioned earlierin python and the 'xxxbytes literal is present for forward compatibility with xbut is the same as 'xxxand makes str (the is ignored)and bytes is just synonym for stras you've seenin both of these address the distinct bytes typec:\codec:\python \python 'spams 'eggs bytes literal is just str in str is bytes/character sequence type( )type( (bs ('spam''eggs' [ ] [ (' '' 'list( )list( ([' '' '' '' '][' '' '' '' ']in the special unicode literal and type accommodates richer forms of textu 'spamtype( unicode and byte strings unicode literal makes distinct type works in toobut is just str there
1,731
'spamu[ 'slist( [ ' ' ' ' ' ' ' 'as we sawfor compatibility this form works in and later toobut it simply makes normal str there (the is ignoredstring type conversions although python allowed str and unicode type objects to be mixed in expressions (when the str contained only -bit ascii text) draws much sharper distinction --str and bytes type objects never mix automatically in expressions and never are converted to one another automatically when passed to functions function that expects an argument to be str object won' generally accept bytesand vice versa because of thispython basically requires that you commit to one type or the otheror perform manualexplicit conversions when neededstr encode(and bytes(sencodingtranslate string to its raw bytes form and create an encoded bytes from decoded str in the process bytes decode(and str(bencodingtranslate raw bytes into its string form and create decoded str from an encoded bytes in the process these encode and decode methods (as well as file objectsdescribed in the next sectionuse either default encoding for your platform or an explicitly passed-in encoding name for examplein python xs 'eggss encode( 'eggsbytes(sencoding='ascii' 'eggsb 'spamb decode('spamstr(bencoding='ascii''spamstr->bytesencode text into raw bytes str->bytesalternative bytes->strdecode raw bytes into text bytes->stralternative two cautions here first of allyour platform' default encoding is available in the sys modulebut the encoding argument to bytes is not optionaleven though it is in str encode (and bytes decodesecondalthough calls to str do not require the encoding argument like bytes doesleaving it off in str calls does not mean that it defaults--insteada str call without an encoding returns the bytes object' print stringnot its str converted form (this is usually not what you'll want!assuming and are still as in the prior listingcoding basic strings
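The conversion tools just described can be sketched as follows under Python 3.X; note the last call in particular, where str without an encoding gives the print string of the bytes object, not a decoded conversion.

S = 'eggs'
print(S.encode())                  # b'eggs': str -> bytes, per the default (UTF-8)
print(bytes(S, encoding='ascii'))  # b'eggs': str -> bytes, alternative spelling

B = b'spam'
print(B.decode())                  # spam: bytes -> str, per the default
print(str(B, encoding='ascii'))    # spam: bytes -> str, alternative spelling
print(str(B), len(str(B)))         # b'spam' 7: print string, NOT a conversion!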
1,732
sys platform 'win sys getdefaultencoding('utf- underlying platform default encoding for str here bytes(stypeerrorstring argument without an encoding str( " 'spam'len(str( ) len(str(bencoding='ascii') str without encoding print stringnot conversionuse encoding to convert to str when in doubtpass in an encoding name argument in xeven if it may have default conversions are similar in python xthough ' support for mixing string types in expressions makes conversions optional for ascii textand the tool names differ for the different string type model--conversions in occur between encoded str and decoded unicoderather than ' encoded bytes and decoded strs 'spamu 'eggssu ('spam' 'eggs'unicode( )str( ( 'spam''eggs' decode() encode(( 'spam''eggs' type string conversion tools converts str->uniuni->str versus byte->strstr->bytes coding unicode strings encoding and decoding become more meaningful when you start dealing with nonascii unicode text to code arbitrary unicode characters in your stringssome of which you might not even be able to type on your keyboardpython string literals support both "\xnnhex byte value escapes and "\unnnnand "\unnnnnnnnunicode escapes in string literals in unicode escapesthe first form gives four hex digits to encode -byte ( -bitcharacter code pointand the second gives eight hex digits for -byte ( -bitcode point byte strings support only hex escapes for encoded text and other forms of byte-based data coding ascii text let' step through some examples that demonstrate text coding basics as we've seenascii text is simple type of unicodestored as sequence of byte values that represent charactersc:\codec:\python \python ord(' ''xis binary code point value in the default encoding unicode and byte strings
1,733
chr( ' stands for character 'xs 'xyza unicode string of ascii text 'xyzlen(sthree characters long [ord(cfor in sthree characters with integer ordinal values [ normal -bit ascii text like this is represented with one character per byte under each of the unicode encoding schemes described earlier in this encode('ascii'values in byte ( bitseach 'xyzs encode('latin- 'values in byte ( bitseach 'xyzs encode('utf- 'values in byte in others or 'xyzin factthe bytes objects returned by encoding ascii text this way are really sequence of short integerswhich just happen to print as ascii characters when possibles encode('latin- ' 'xyzs encode('latin- ')[ list( encode('latin- ')[ coding non-ascii text formallyto code non-ascii characterswe can usehex or unicode escapes to embed unicode code point ordinal values in text strings --normal string literals in xand unicode string literals in (and in for compatibilityhex escapes to embed the encoded representation of characters in byte strings-normal string literals in xand bytes string literals in (and in for compatibilitynote that text strings embed actual code point valueswhile byte strings embed their encoded form the value of character' encoded representation in byte string is the same as its decoded unicode code point value in text string for only certain characters and encodings in any eventhex escapes are limited to coding single byte' valuebut unicode escapes can name characters with values and bytes wide the chr function can also be used to create single non-ascii character from its code point valueand as we'll see latersource code declarations apply to such characters embedded in your script coding unicode strings
1,734
outside the -bit range of asciibut we can embed them in str objects because str supports unicodechr( xc 'achr( xe ' xc xe characters outside ascii' range '\xc \xe 'aesingle -bit value hex escapestwo digits '\ \ 'aelen( -bit unicode escapesfour digits each two characters long (not number of bytes!note that in unicode text string literals like thesehex and unicode escapes denote unicode code point valuenot byte values the hex escapes require exactly two digits (for -bit code point values)and and unicode escapes require exactly four and eight hexadecimal digitsrespectivelyfor denoting code point values that can be as big as and bits will allows '\ \ 'ae -bit unicode escapeseight digits each as shown laterpython works similarly in this regardbut unicode escapes are allowed only in its unicode literal form they work in normal string literals in here simply because its normal strings are always unicode encoding and decoding non-ascii text nowif we try to encode the prior section' non-ascii text string into raw bytes using as asciiwe'll get an errorbecause its characters are outside ascii' -bit code point value ranges '\ \ 'aelen( non-ascii text stringtwo characters long encode('ascii'unicodeencodeerror'asciicodec can' encode characters in position - ordinal not in range( encoding this as latin- worksthoughbecause each character falls into that encoding' -bit rangeand we get byte per character allocated in the encoded byte string encoding as utf- also worksthis encoding supports wide range of unicode code unicode and byte strings
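A small sketch of these escape forms in Python 3.X text strings; all spellings here denote the same two code points.

S = '\xc4\xe8'                # hex escapes: two hex digits per code point
print(S, len(S))              # Äè 2
S = '\u00c4\u00e8'            # 16-bit Unicode escapes: four hex digits each
print(S, len(S))              # Äè 2
S = '\U000000c4\U000000e8'    # 32-bit Unicode escapes: eight hex digits each
print(S, len(S))              # Äè 2
print(chr(0xC4) + chr(0xE8))  # Äè: the same string built from code point values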
1,735
are written to filethe raw bytes shown here for encoding results are what is actually stored on the file for the encoding types givens encode('latin- ' '\xc \xe byte per character when encoded encode('utf- ' '\xc \ \xc \xa bytes per character when encoded len( encode('latin- ') len( encode('utf- ') bytes in latin- in utf- note that you can also go the other wayreading raw bytes from file and decoding them back to unicode string howeveras we'll see laterthe encoding mode you give to the open call causes this decoding to be done for you automatically on input (and avoids issues that may arise from reading partial character sequences when reading by blocks of bytes) '\xc \xe '\xc \xe len( decode('latin- ''aeb '\xc \ \xc \xa len( decode('utf- ''aelen( decode('utf- ') text encoded per latin- raw bytestwo encoded characters decode to text per latin- text encoded per utf- raw bytestwo encoded characters decode to text per utf- two unicode characters in memory other encoding schemes some encodings use even larger byte sequences to represent characters when neededyou can specify both and -bit unicode code point values for characters in your strings--as shown earlierwe can use "\ with four hex digits for the formerand "\ with eight hex digits for the latterand can mix these in literals with simpler ascii characters freelys ' \ \ cs 'aabeclen( abcand non-ascii characters five characters long encode('latin- ' ' \xc \xe ccoding unicode strings
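To see the size differences directly, here is a minimal round-trip sketch under Python 3.X.

S = '\xc4\xe8'                          # two code points in memory
print(S.encode('latin-1'))              # b'\xc4\xe8': 1 byte per character
print(S.encode('utf-8'))                # b'\xc3\x84\xc3\xa8': 2 bytes per character
print(len(S.encode('latin-1')), len(S.encode('utf-8')))    # 2 4

B = b'\xc3\x84\xc3\xa8'                 # raw bytes as read from a UTF-8 file, say
print(B.decode('utf-8'), len(B.decode('utf-8')))           # Äè 2: back to code points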
1,736
encode('utf- ' ' \xc \ \xc \xa clen( encode('utf- ') bytes when encoded per latin- bytes when encoded per utf- technically speakingyou can also build unicode strings piecemeal using chr instead of unicode or hex escapesbut this might become tedious for large stringss 'achr( xc 'bchr( xe 'cs 'aabecsome other encodings may use very different byte formatsthough the cp ebcdic encodingfor exampledoesn' even encode ascii the same way as the encodings we've been using so farsince python encodes and decodes for uswe only generally need to care about this when providing encoding names for data sourcess 'aabecs encode('cp ' '\xc \xc \xc encode('cp ' ' \ eb\ acs 'spams encode('latin- ' 'spams encode('utf- ' 'spams encode('cp ' '\xa \ \ \ encode('cp ' 'spamtwo other western european encodings bytes eachdifferent encoded values ascii text is the same in most but not in cp ibm ebcdicthe same holds true for the utf- and utf- encodingswhich use fixed and byte-per-character schemes with same-sized headers--non-ascii encodes differentlyand ascii is not byte per characters ' \ \ cs encode('utf- ' '\xff\xfea\ \xc \ \ \xe \ \ 'spams encode('utf- ' '\xff\xfes\ \ \ \ encode('utf- ' '\xff\xfe\ \ \ \ \ \ \ \ \ \ \ \ \ \ unicode and byte strings
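The byte counts for these schemes can be verified with a quick sketch in Python 3.X; the string mixes ASCII with two non-ASCII characters.

S = 'A\u00c4B\U000000e8C'
print(S, len(S))                        # AÄBèC 5
print(len(S.encode('latin-1')))         # 5:  1 byte per character
print(len(S.encode('utf-8')))           # 7:  ASCII takes 1 byte here, the others 2
print(len(S.encode('utf-16')))          # 12: 2-byte BOM + 2 bytes per character
print(len(S.encode('utf-32')))          # 24: 4-byte BOM + 4 bytes per character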
1,737
two cautions here too firstpython allows special characters to be coded with both hex and unicode escapes in str stringsbut only with hex escapes in bytes strings --unicode escape sequences are silently taken verbatim in bytes literalsnot as escapes in factbytes must be decoded to str strings to print their non-ascii characters properlys ' \xc \xe cs 'aabecs ' \ \ cs 'aabec xstr recognizes hex and unicode escapes ' \xc \xe cb ' \xc \xe cb ' \ \ cb ' \\ \\ cbytes recognizes hex but not unicode ' \xc \xe cb ' \xc \xe cprint(bb' \xc \xe cb decode('latin- ''aabecuse hex escapes for bytes prints non-ascii as hex escape sequences taken literallydecode as latin- to interpret as text secondbytes literals require characters either to be ascii characters orif their values are greater than to be escapedstr stingson the other handallow literals containing any character in the source character set--whichas discussed laterdefaults to utf- unless an encoding declaration is given in the source files 'aabecs 'aabecchars from utf- if no encoding declaration 'aabecsyntaxerrorbytes can only contain ascii literal characters ' \xc \xe cb ' \xc \xe cb decode('latin- ''aabecchars must be asciior escapes encode( ' \xc \ \xc \xa cs encode('utf- ' ' \xc \ \xc \xa csource code encoded per utf- by default uses system default to encodeunless passed coding unicode strings
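The two cautions can be demonstrated briefly under Python 3.X; the final decode is what lets an escaped byte string display as readable text.

S = 'A\u00c4B\xe8C'            # str: \u and \x escapes both give code points
print(S)                       # AÄBèC

B = b'A\xc4B\xe8C'             # bytes: only \x (byte value) escapes apply
print(B)                       # b'A\xc4B\xe8C': non-ASCII bytes print as hex escapes
print(B.decode('latin-1'))     # AÄBèC: decode to interpret the bytes as text
# A \u sequence in a bytes literal is not an escape at all: it is taken verbatim
# (and newer Pythons issue a warning for the unrecognized escape).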
1,738
raw bytes do not correspond to utf- unicodedecodeerror'utf codec can' decode bytes in position - both these constraints make sense if you remember that byte strings hold bytes-based datanot decoded unicode code point ordinalswhile they may contain the encoded form of textdecoded code point values don' quite apply to byte strings unless the characters are first encoded converting encodings so farwe've been encoding and decoding strings to inspect their structure it' also possible to convert string to different encoding than its originalbut we must provide an explicit encoding name to encode to and decode from this is true whether the original text string originated in file or literal the term conversion may be misnomer here--it really just means encoding text string to raw bytes per different encoding scheme than the one it was decoded from as stressed earlierdecoded text in memory has no encoding typeand is simply string of unicode code points ( characters)there is no concept of changing its encoding in this form stillthis scheme allows scripts to read data in one encoding and store it in anotherto support multiple clients of the same datab ' \xc \ \xc \xa cs decode('utf- ' 'aabectext encoded in utf- format originally decode to unicode text per utf- encode('cp ' '\xc \xc \xc convert to encoded bytes per ebcdic decode('cp ' 'aabecconvert back to unicode per ebcdic encode( ' \xc \ \xc \xa cper default utf- encoding again keep in mind that the special unicode and hex character escapes are only necessary when you code non-ascii unicode strings manually in practiceyou'll often load such text from files instead as we'll see later in this ' file object (created with the open built-in functionautomatically decodes text strings as they are read and encodes them when they are writtenbecause of thisyour script can often deal with strings genericallywithout having to code special characters directly later in this we'll also see that it' possible to convert between encodings when transferring strings to and from filesusing technique very similar to that in the last examplealthough you'll still need to provide explicit encoding names when opening filethe file interface does most of the conversion work for you automatically unicode and byte strings
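A minimal round-trip sketch of such a conversion in Python 3.X; the exact EBCDIC byte values are omitted here, since the point is that the decoded text is unchanged.

B = b'A\xc3\x84B\xc3\xa8C'          # raw bytes, UTF-8-encoded text originally
S = B.decode('utf-8')               # decode to Unicode code points: 'AÄBèC'
T = S.encode('cp500')               # encode the same text per EBCDIC instead
print(T != B)                       # True: a different byte rendering on disk or wire
print(T.decode('cp500') == S)       # True: but it decodes back to the same text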
1,739
stress python unicode support in this because it' new but now that 've shown you the basics of unicode strings in xi need to explain more fully how you can do much the same in xthough the tools differ unicode is available in python xbut is distinct type from strsupports most of the same operationsand allows mixing of normal and unicode strings when the str is all ascii in factyou can essentially pretend ' str is ' bytes when it comes to decoding raw bytes into unicode stringas long as it' in the proper form here is in actionunicode characters display in hex in unless you explicitly printand non-ascii displays can vary per shell (most of this section ran outside idlewhich sometimes detects and prints latin- characters in encoded byte strings--see ahead for more on pythonioencoding and windows command prompt display issues) :\codec:\python \python ' \xc \xe cs ' \xc \xe cprint -bphc decode('latin ' ' \xc \xe cprint aabec string of -bit bytes text encoded per latin- some non-ascii nonprintable characters (idle may differdecode bytes to unicode text per latin- decode('utf- 'encoded form not compatible with utf- unicodedecodeerror'utf codec can' decode byte xc in position invalid ontinuation byte decode('ascii'encoded bytes are also outside ascii range unicodedecodeerror'asciicodec can' decode byte xc in position ordinal not in range( to code unicode textmake unicode object with the 'xxxliteral form (as mentionedthis literal is available again in but superfluous in in generalsince its normal strings support unicode) ' \xc \xe cu ' \xc \xe cprint aabec make unicode stringhex escapes once you've created ityou can convert unicode text to different raw byte encodingssimilar to encoding str objects into bytes objects in xu encode('latin- '' \xc \xe cencode per latin- -bit bytes coding unicode strings
1,740
' \xc \ \xc \xa cencode per utf- multibyte non-ascii characters can be coded with hex or unicode escapes in string literals in xjust as in howeveras with bytes in xthe "\ and "\ escapes are recognized only for unicode strings in xnot -bit str strings--againthese are used to give the values of decoded unicode ordinal integerswhich don' make sense in raw byte stringc:\codec:\python \python ' \xc \xe cu ' \xc \xe cprint aabec hex escapes for non-ascii ' \ \ cu ' \xc \xe cprint aabec unicode escapes for non-ascii ' bitsu' bits ' \xc \xe cs ' \xc \xe cprint -bphc print decode('latin- 'aabec hex escapes work ' \ \ cs ' \\ \\ cprint \ \ len( not unicode escapestaken literallybut some may print oddlyunless decoded mixing string types in like ' str and bytes ' unicode and str share nearly identical operation setsso unless you need to convert to other encodings you can often treat unicode as though it were str one of the primary differences between and xthoughis that uni code and non-unicode str objects can be freely mixed in expressions--as long as the str is compatible with the unicode objectpython will automatically convert it up to unicodeu'ab'cdu'abcdcan mix if compatible in but 'abb'cdnot allowed in howeverthis liberal approach to mixing string types in works only if the -bit string happens to contain only -bit (asciibytes unicode and byte strings
1,741
can' mix in if str is non-asciiu ' \xc \xe cs unicodedecodeerror'asciicodec can' decode byte xc in position ordinal not in range( 'abcu 'abca\xc \xe cprint 'abcu abcaabec can mix only if str is all -bit ascii decode('latin- ' ' \xc \xe ca\xc \xe cprint decode('latin- ' aabecaabec manual conversion may be required in too print '\xa ' ps also see ' currency example use print to display characters by contrastin xstr and bytes never mix automatically and require manual conversions--the preceding code actually runs in but only because ' unicode literal is taken to be the same as normal string by (the is ignored)the equivalent would be str added to bytes ( 'abb'cd'which fails in xunless objects are converted to common type in xthoughthe difference in types is often trivial to your code like normal stringsunicode strings may be concatenatedindexedslicedmatched with the re moduleand so onand they cannot be changed in place if you ever need to convert between the two types explicitlyyou can use the built-in str and unicode functions as shown earlierstr( 'spam''spamunicode('spam' 'spamunicode to normal normal to unicode if you are using python xalso watch for an example of your different file interface later in this your open call supports only files of -bit bytesreturning their contents as str stringsand it' up to you to interpret the contents as text or binary data and decode if needed to read and write unicode files and encode or decode their content automaticallyuse ' codecs open call we'll see in action later in this this call provides much the same functionality as ' open and uses unicode objects to represent file content--reading file translates encoded bytes into decoded unicode charactersand writing translates strings to the desired encoding specified when the file is opened source file character set encoding declarations finallyunicode escape codes are fine for the occasional unicode character in string literalsbut they can become tedious if you need to embed non-ascii text in your coding unicode strings
1,742
the text of your script filespython uses the utf- encoding by defaultbut it allows you to change this to support arbitrary character sets by including comment that names your desired encoding the comment must be of this form and must appear as either the first or second line in your script in either python or -*codinglatin- -*when comment of this form is presentpython will recognize strings represented natively in the given encoding this means you can edit your script file in text editor that accepts and displays accented and other non-ascii characters correctlyand python will decode them correctly in your string literals for examplenotice how the comment at the top of the following filetext pyallows latin- characters to be embedded in stringswhich are themselves embedded in the script file' text-*codinglatin- -*any of the following string literal forms work in latin- changing the encoding above to either ascii or utf- failsbecause the xc and xe in mystr are not valid in either mystr 'aabecmystr ' \ \ cmystr 'achr( xc 'bchr( xe 'cimport sys print('default encoding:'sys getdefaultencoding()for astr in mystr mystr mystr print('{ }strlen={ }format(astrlen(astr))end=''bytes astr encode(bytes astr encode('latin- '#bytes astr encode('ascii'per default utf- bytes for non-ascii one byte per char ascii failsoutside range print('byteslen ={ }byteslen ={ }format(len(bytes )len(bytes ))when runthis script produces the following outputgivingfor each of three coding techniquesthe stringits lengthand the lengths of its utf- and latin- encoded byte string forms :\codec:\python \python text py default encodingutf- aabecstrlen= byteslen = byteslen = aabecstrlen= byteslen = byteslen = aabecstrlen= byteslen = byteslen = since many programmers are likely to fall back on the standard utf- encodingi'll defer to python' standard manual set for more details on this option and other advanced unicode support topicssuch as properties and character name escapes in strings ' omitting here for this let' take quick look at the new byte string object types in python xbefore moving on to its file and tool changes unicode and byte strings
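For reference, here is a minimal sketch of such a script; for the declaration to matter, the file itself must be saved in the encoding the comment names (Latin-1 here), and the string value is an arbitrary example.

# -*- coding: latin-1 -*-
myStr = 'aÄBèc'                         # non-ASCII literals, decoded per the comment above
print(myStr, len(myStr))                # aÄBèc 5: five characters
print(len(myStr.encode('latin-1')))     # 5 bytes when encoded as Latin-1
print(len(myStr.encode('utf-8')))       # 7 bytes when encoded as UTF-8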
1,743
file declarationssee the currency symbols used in the money formatting example of as well as its associated file in this book' examples packageformats_currency py the latter requires source-file declaration to be usable by pythonbecause it embeds non-ascii currency symbol characters this example also illustrates the portability gains possible when using ' unicode literal in code in and later using bytes objects we studied wide variety of operations available for python ' general str string type in the basic string type works identically in and xso we won' rehash this topic insteadlet' dig bit deeper into the operation sets provided by the new bytes type in as mentioned previouslythe bytes object is sequence of small integerseach of which is in the range through that happens to print as ascii characters when displayed it supports sequence operations and most of the same methods available on str objects (and present in ' str typehoweverbytes does not support the for mat method or the formatting expressionand you cannot mix and match bytes and str type objects without explicit conversions--you generally will use all str type objects and text files for text dataand all bytes type objects and binary files for binary data method calls if you really want to see what attributes str has that bytes doesn'tyou can always check their dir built-in function results the output can also tell you something about the expression operators they support ( __mod__ and __rmod__ implement the operator) :\codec:\python \python attributes in str but not bytes set(dir('abc')set(dir( 'abc'){'isdecimal''__mod__''__rmod__''format_map''isprintable''casefold''format''isnumeric''isidentifier''encode'attributes in bytes but not str set(dir( 'abc')set(dir('abc'){'decode''fromhex'as you can seestr and bytes have almost identical functionality their unique attributes are generally methods that don' apply to the otherfor instancedecode translates raw bytes into its str representationand encode translates string into its raw bytes representation most of the methods are the samethough bytes methods require bytes arguments (again string types don' mixalso recall that bytes objects are using bytes objects
1,744
shortened for brevity) 'spamb find( 'pa' bbytes literal replace( 'pa' 'xy' 'sxymbytes methods expect bytes arguments split( 'pa'[ ' ' ' 'bytes methods return bytes results 'spamb[ 'xtypeerror'bytesobject does not support item assignment one notable difference is that string formatting works only on str objects in xnot on bytes objects (see for more on string formatting expressions and methods)'% ' '% typeerrorunsupported operand type(sfor %'bytesand 'int'{ }format( ' '{ }format( attributeerror'bytesobject has no attribute 'formatsequence operations besides method callsall the usual generic sequence operations you know (and possibly lovefrom python strings and lists work as expected on both str and bytes in xthis includes indexingslicingconcatenationand so on notice in the following that indexing bytes object returns an integer giving the byte' binary valuebytes really is sequence of -bit integersbut for convenience prints as string of ascii-coded characters where possible when displayed as whole to check given byte' valueuse the chr built-in to convert it back to its characteras in the followingb 'spamb 'spama sequence of small ints prints as ascii characters (and/or hex escapesb[ [- indexing yields an int chr( [ ]'sshow character for int unicode and byte strings
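A short sketch of bytes method calls under Python 3.X; note that since Python 3.5, bytes does support the % formatting operator (PEP 461), though the format method remains str-only.

B = b'spam'
print(B.find(b'pa'))               # 1: method arguments must be bytes too
print(B.replace(b'pa', b'xy'))     # b'sxym'
print(B.split(b'a'))               # [b'sp', b'm']: bytes methods return bytes results
print('{0}'.format(B))             # b'spam': str formatting of the repr only
print(b'%s and %d' % (b'spam', 2)) # b'spam and 2': works as of Python 3.5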
1,745
[ show all the byte' int values [ :] [:- ( 'pam' 'spa'len( 'lmnb'spamlmnb 'spamspamspamspamother ways to make bytes objects so farwe've been mostly making bytes objects with the bliteral syntax we can also create them by calling the bytes constructor with str and an encoding namecalling the bytes constructor with an iterable of integers representing byte valuesor encoding str object per the default (or passed-inencoding as we've seenencoding takes text str and returns the raw encoded byte values of the string per the encoding specifiedconverselydecoding takes raw bytes sequence and translates it to its str text string representation-- series of unicode characters both operations create new string objectsb 'abcb 'abcliteral bytes('abc''ascii' 'abcconstructor with encoding name ord(' ' bytes([ ] 'abcb 'spamencode( 'spams decode( 'spaminteger iterable str encode((or bytes()bytes decode((or str()from functional perspectivethe last two of these operations are really tools for converting between str and bytesa topic introduced earlier and expanded upon in the next section using bytes objects
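The different construction techniques all yield equal objects, as this small sketch shows under Python 3.X.

B1 = b'abc'                        # literal form
B2 = bytes('abc', 'ascii')         # constructor with an encoding name
B3 = bytes([97, 98, 99])           # constructor with an iterable of byte values
B4 = 'abc'.encode()                # str.encode, default (UTF-8) encoding
print(B1 == B2 == B3 == B4)        # True
print(B4.decode())                 # abc: decode back to a str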
1,746
in the replace call of the section "method callson page we had to pass in two bytes objects--str types won' work there although python automatically converts str to and from unicode when possible ( when the str is -bit ascii text)python requires specific string types in some contexts and expects manual conversions if neededmust pass expected types to function and method calls 'spamb replace('pa''xy'typeerrorexpected an object with the buffer interface replace( 'pa' 'xy' 'sxymb 'spamb replace(bytes('pa')bytes('xy')typeerrorstring argument without an encoding replace(bytes('pa''ascii')bytes('xy''utf- ') 'sxymmust convert manually in mixed-type expressions 'ab'cdtypeerrorcan' concat bytes to str 'abdecode('cd'abcdb'ab'cdencode( 'abcdb'abbytes('cd''ascii' 'abcdbytes to str str to bytes str to bytes although you can create bytes objects yourself to represent packed binary datathey can also be made automatically by reading files opened in binary modeas we'll see in more detail later in this firstthoughlet' introduce bytes' very closeand mutablecousin using / bytearray objects so far we've focused on str and bytesbecause they subsume python ' unicode and str python grew third string typethough--bytearraya mutable sequence of integers in the range through which is mutable variant of bytes as suchit supports the same string methods and sequence operations as bytesas well as many of the mutable in-place-change operations supported by lists unicode and byte strings
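In sketch form, the manual conversions such code requires look like this under Python 3.X.

B, S = b'ab', 'cd'
# B + S raises a TypeError in 3.X: str and bytes never mix automatically
print(B + S.encode())                       # b'abcd': convert str -> bytes
print(B.decode() + S)                       # abcd:    convert bytes -> str
print(b'spam'.replace(b'pa', b'xy'))        # b'sxym': arguments must match the subject's type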
1,747
of text such as asciiwhich can be represented with byte per character (richer unicode text generally requires unicode stringswhich are still immutablethe bytear ray type is also available in python and as back-port from xbut it does not enforce the strict text/binary distinction there that it does in bytearrays in action let' take quick tour we can create bytearray objects by calling the bytearray builtin in python xany string may be used to initializecreation in mutable sequence of small ( ints 'spamc bytearray(sc bytearray( 'spam' back-port from in =in (strin python xan encoding name or byte string is requiredbecause text and binary strings do not mix (though byte strings may reflect encoded unicode text)creation in xtext/binary do not mix 'spamc bytearray(stypeerrorstring argument without an encoding bytearray( 'latin ' bytearray( 'spam' content-specific type in 'spamc bytearray(bc bytearray( 'spam' !in (bytes/stronce createdbytearray objects are sequences of small integers like bytes and are mutable like liststhough they require an integer for index assignmentsnot string (all of the following is continuation of this session and is run under python unless otherwise noted--see comments for usage notes)mutablebut must assign intsnot strings [ [ 'xtypeerroran integer is required [ 'xtypeerroran integer is required this and the next work in [ ord(' ' use ord(to get character' ordinal using / bytearray objects
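A brief sketch of these creation and mutation rules in Python 3.X follows.

C = bytearray(b'spam')                  # from a bytes literal
print(bytearray('spam', 'latin-1'))     # from str: an encoding name is required in 3.X
C[0] = ord('x')                         # index assignment takes an int, not a str
C[1] = b'Y'[0]                          # or index a byte string to get one
print(C)                                # bytearray(b'xYam')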
1,748
[ ' '[ bytearray( 'xyam'or index byte string processing bytearray objects borrows from both strings and listssince they are mutable byte strings while the byterrray' methods overlap with both str and bytesit also has many of the list' mutable methods besides named methodsthe __iadd__ and __setitem__ methods in bytearray implement +in-place concatenation and index assignmentrespectivelyin bytes but not bytearray set(dir( 'abc')set(dir(bytearray( 'abc')){'__getnewargs__'in bytearray but not bytes set(dir(bytearray( 'abc'))set(dir( 'abc'){'__iadd__''reverse''__setitem__''extend''copy''__alloc__''__delitem__''__imul__''remove''clear''insert''append''pop'you can change bytearray in place with both index assignmentas you've just seenand list-like methods like those shown here (to change text in place prior to you would need to convert to and then from listwith list(strand 'join(list)--see and for examples)mutable method calls bytearray( 'xyam' append( 'lmn'typeerroran integer is required requires string of size append(ord(' ') bytearray( 'xyaml' extend( 'mno' bytearray( 'xyamlmno'all the usual sequence operations and string methods work on bytearraysas you would expect (notice that like bytes objectstheir expressions and methods expect bytes argumentsnot str arguments)sequence operations and string methods bytearray( 'xyamlmno' '!#bytearray( 'xyamlmno!#' [ unicode and byte strings
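And a sketch of the list-like, in-place change methods, again under Python 3.X.

C = bytearray(b'spam')
C.append(ord('!'))             # append takes a single integer byte value
C.extend(b'LMN')               # extend takes an iterable of byte values
C += b'...'                    # in-place concatenation works too
print(C)                       # bytearray(b'spam!LMN...')
C.pop()                        # list-like removals
C.remove(ord('!'))
print(C)                       # bytearray(b'spamLMN..')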
1,749
bytearray( 'yamlmno'len( replace('xy''sp'this works in typeerrortype str doesn' support the buffer api replace( 'xy' 'sp'bytearray( 'spamlmno' bytearray( 'xyamlmno' bytearray( 'xyamlmnoxyamlmnoxyamlmnoxyamlmno'python string types summary finallyby way of summarythe following examples demonstrate how bytes and byte array objects are sequences of intsand str objects are sequences of charactersbinary versus text 'spamlist( [ is same as in bytearray( 'xyamlmno'list( [ 'spamlist( [' '' '' '' 'although all three python string types can contain character values and support many of the same operationsagainyou should alwaysuse str for textual data use bytes for binary data use bytearray for binary data you wish to change in place related tools such as filesthe next section' topicoften make the choice for you using text and binary files this section expands on the impact of python ' string model on the file processing basics introduced earlier in the book as mentioned earlierthe mode in which you open file is crucial--it determines which object type you will use to represent the file' using text and binary files
1,750
objectstext-mode files interpret file contents according to unicode encoding--either the default for your platformor one whose name you pass in by passing in an encoding name to openyou can force conversions for various types of unicode files textmode files also perform universal line-end translationsby defaultall line-end forms map to the single '\ncharacter in your scriptregardless of the platform on which you run it as described earliertext files also handle reading and writing the byte order mark (bomstored at the start-of-file in some unicode encoding schemes binary-mode files instead return file content to you rawas sequence of integers representing byte valueswith no encoding or decoding and no line-end translations the second argument to open determines whether you want text or binary processingjust as it does in python--adding to this string implies binary mode ( "rbto read binary data filesthe default mode is "rt"this is the same as " "which means text input (just as in xin xthoughthis mode argument to open also implies an object type for file content representationregardless of the underlying platform--text files return str for reads and expect one for writesbut binary files return bytes for reads and expect one (or bytearrayfor writes text file basics to demonstratelet' begin with basic file / as long as you're processing basic text files ( asciiand don' care about circumventing the platform-default encoding of stringsfiles in look and feel much as they do in (for that matterso do strings in generalthe followingfor instancewrites one line of text to file and reads it back in xexactly as it would in (note that file is no longer built-in name in xso it' perfectly ok to use it as variable here) :\codec:\python \python basic text files (and stringswork the same as in file open('temp'' 'size file write('abc\ 'file close(file open('temp'text file read(text 'abc\nprint(textabc unicode and byte strings returns number of characters written manual close to flush output buffer default mode is " (="rt")text input
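In sketch form, the basic text-file round trip looks like this under Python 3.X; the filename is arbitrary.

f = open('temp.txt', 'w')           # text mode is the default ('w' is 'wt')
count = f.write('abc\n')            # returns the number of characters written
f.close()                           # manual close flushes output buffers
print(count)                        # 4
print(open('temp.txt').read())      # abc: default mode is 'r', text input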
1,751
in python xthere is no major distinction between text and binary files--both accept and return content as str strings the only major difference is that text files automatically map \ end-of-line characters to and from \ \ on windowswhile binary files do not ( ' stringing operations together into one-liners here just for brevity) :\codec:\python \python open('temp'' 'write('abd\ 'open('temp'' 'read('abd\nopen('temp''rb'read('abd\ \nopen('temp''wb'write('abc\ 'open('temp'' 'read('abc\nopen('temp''rb'read('abc\nwrite in text modeadds \ read in text modedrops \ read in binary modeverbatim write in binary mode \ not expanded to \ \ in python xthings are bit more complex because of the distinction between str for text data and bytes for binary data to demonstratelet' write text file and read it back in both modes in notice that we are required to provide str for writingbut reading gives us str or bytesdepending on the open modec:\codec:\python \python write and read text file open('temp'' 'write('abc\ ' open('temp'' 'read('abc\nopen('temp''rb'read( 'abc\ \ntext mode outputprovide str text mode inputreturns str binary mode inputreturns bytes notice how on windows text-mode files translate the \ end-of-line character to \ \ on outputon inputtext mode translates the \ \ back to \nbut binary-mode files do not this is the same in xand it' normally what we want--text files should for portability map end-of-line markers to and from \ (which is what is actually present in files in linuxwhere no mapping occurs)and such translations should never occur for binary data (where end-of-line bytes are irrelevantalthough you can control this behavior with extra open arguments in if desiredthe default usually works well now let' do the same againbut with binary file we provide bytes to write in this caseand we still get back str or bytesdepending on the input modewrite and read binary file open('temp''wb'write( 'abc\ ' open('temp'' 'read('abc\nopen('temp''rb'read( 'abc\nbinary mode outputprovide bytes text mode inputreturns str binary mode inputreturns bytes using text and binary files
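The end-of-line behavior can be seen by reading the same file in both modes; this sketch assumes Python 3.X, and the \r shows up only on Windows.

open('temp.txt', 'w').write('abc\n')       # text mode: \n becomes \r\n on Windows
print(open('temp.txt', 'r').read())         # 'abc\n': translated back on text-mode input
print(open('temp.txt', 'rb').read())        # b'abc\r\n' on Windows, b'abc\n' elsewhere

open('temp.bin', 'wb').write(b'abc\n')      # binary mode: bytes written verbatim
print(open('temp.bin', 'rb').read())        # b'abc\n' on every platform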
1,752
--againa desired result for binary data type requirements and file behavior are the same even if the data we're writing to the binary file is truly binary in nature in the followingfor examplethe "\ is binary zero byte and not printable characterwrite and read truly binary data open('temp''wb'write( ' \ ' open('temp'' 'read(' \ copen('temp''rb'read( ' \ cprovide bytes receive str receive bytes binary-mode files always return contents as bytes objectbut accept either bytes or bytearray object for writingthis naturally followsgiven that bytearray is basically just mutable variant of bytes in factmost apis in python that accept bytes also allow bytearraybytearrays work too ba bytearray( '\ \ \ 'open('temp''wb'write(ba open('temp'' 'read('\ \ \ open('temp''rb'read( '\ \ \ type and content mismatches in notice that you cannot get away with violating python' str/bytes type distinction when it comes to files as the following examples illustratewe get errors (shortened hereif we try to write bytes to text file or str to binary file (the exact text of the error messages here is prone to change)types are not flexible for file content open('temp'' 'write('abc\ ' open('temp'' 'write( 'abc\ 'typeerrormust be strnot bytes text mode makes and requires str open('temp''wb'write( 'abc\ 'binary mode makes and requires bytes open('temp''wb'write('abc\ 'typeerror'strdoes not support the buffer interface this makes sensetext has no meaning in binary termsbefore it is encoded although it is often possible to convert between the types by encoding str and decoding bytesas described earlier in this you will usually want to stick to either str for text data or bytes for binary data because the str and bytes operation sets largely intersectthe choice won' be much of dilemma for most programs (see the string tools coverage in the final section of this for some prime examples of this unicode and byte strings
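Catching the errors makes the type rules explicit; a small sketch under Python 3.X (the exact message text varies by version, and the filenames are arbitrary).

try:
    open('temp.txt', 'w').write(b'abc\n')   # bytes content to a text-mode file
except TypeError as exc:
    print('text mode needs str:', exc)

try:
    open('temp.bin', 'wb').write('abc\n')   # str content to a binary-mode file
except TypeError as exc:
    print('binary mode needs bytes:', exc)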
1,753
in addition to type constraintsfile content can matter in text-mode output files require str instead of bytes for contentso there is no way in to write truly binary data to text-mode file depending on the encoding rulesbytes outside the default character set can sometimes be embedded in normal stringand they can always be written in binary mode (some of the following raise errors when displaying their string results in pythons prior to but the file operations work successfully)can' read truly binary data in text mode chr( xff'ychr( xfe'\xfeff is valid charfe is not an error in some pythons open('temp'' 'write( '\xff\xfe\xfd'typeerrormust be strnot bytes can' use arbitrary bytesopen('temp'' 'write('\xff\xfe\xfd' open('temp''wb'write( '\xff\xfe\xfd' can write if embeddable in str open('temp''rb'read( '\xff\xfe\xfdcan always read as binary bytes open('temp'' 'read(' \xfe\xfdcan' read text unless decodablean error in some pythons can also write in binary mode in generalhoweverbecause text-mode input files in must be able to decode content per unicode encodingthere is no way to read truly binary data in text modeas the next section explains using unicode files so farwe've been reading and writing basic text and binary files it turns out to be easy to read and write unicode text stored in files toobecause the open call accepts an encoding for text filesand arranges to run the required encoding and decoding for us automatically as data is transferred this allows us to process variety of unicode text created with different encodings than the default for the platformand store the same text in different encodings for different purposes reading and writing unicode in in factwe can effectively convert string to different encoded forms both manually with method calls as we did earlierand automatically on file input and output we'll use the following unicode string in this section to demonstratec:\codec:\python \python ' \xc \xe cs 'aabecfive-character decoded stringnon-ascii using unicode files
1,754
manual encoding as we've already learnedwe can always encode such string to raw bytes according to the target encoding nameencode manually with methods encode('latin- ' ' \xc \xe clen( encode('utf- ' ' \xc \ \xc \xa clen( bytes when encoded as latin- bytes when encoded as utf- file output encoding nowto write our string to text file in particular encodingwe can simply pass the desired encoding name to open--although we could manually encode first and write in binary modethere' no need toencoding automatically when written open('latindata'' 'encoding='latin- 'write( open('utf data'' 'encoding='utf- 'write( write as latin- write as utf- open('latindata''rb'read( ' \xc \xe cread raw bytes open('utf data''rb'read( ' \xc \ \xc \xa cdifferent in files file input decoding similarlyto read arbitrary unicode datawe simply pass in the file' encoding type name to openand it decodes from raw bytes to strings automaticallywe could read raw bytes and decode manually toobut that can be tricky when reading in blocks (we might read an incomplete character)and it isn' necessarydecoding automatically when read open('latindata'' 'encoding='latin- 'read('aabecopen('utf data'' 'encoding='utf- 'read('aabecx open('latindata''rb'read( decode('latin- ' unicode and byte strings decoded on input per encoding type manual decodingnot necessary
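Putting these pieces together, here is a compact sketch of writing one string out under two encodings and reading it back, assuming Python 3.X; the filenames are arbitrary.

S = 'A\xc4B\xe8C'                                            # five decoded characters
open('latindata', 'w', encoding='latin-1').write(S)          # encoded on the way out
open('utf8data', 'w', encoding='utf-8').write(S)

print(open('latindata', 'rb').read())                        # b'A\xc4B\xe8C': 5 bytes
print(open('utf8data', 'rb').read())                         # b'A\xc3\x84B\xc3\xa8C': 7 bytes
print(open('utf8data', 'r', encoding='utf-8').read())        # AÄBèC: decoded on input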
1,755
open('utf data''rb'read( decode('aabecutf- is default decoding mismatches finallykeep in mind that this behavior of files in limits the kind of content you can load as text as suggested in the prior sectionpython really must be able to decode the data in text files into str stringaccording to either the default or passedin unicode encoding name trying to open truly binary data file in text modefor exampleis unlikely to work in even if you use the correct object typesfile open( ' :\python \python exe'' 'text file read(unicodedecodeerror'charmapcodec can' decode byte in position file open( ' :\python \python exe''rb'data file read(data[: 'mz\ \ \ \ \ \ \ \ \ \ \xff\xff\ \ \xb \ \ \ the first of these examples might not fail in python (normal files do not decode text)even though it probably shouldreading the file may return corrupted data in the stringdue to automatic end-of-line translations in text mode (any embedded \ \ bytes will be translated to \ on windows when readto treat file content as unicode text in xwe need to use special tools instead of the general open built-in functionas we'll see in moment firstthoughlet' turn to more explosive topic handling the bom in as described earlier in this some encoding schemes store special byte order marker (bomsequence at the start of filesto specify data endianness (which end of string of bits is most significant to its valueor declare the encoding type python both skips this marker on input and writes it on output if the encoding name implies itbut we sometimes must use specific encoding name to force bom processing explicitly for examplein the utf- and utf- encodingsthe bom specifies bigor littleendian format utf- text file may also include bombut this isn' guaranteedand serves only to declare that it is utf- in general when reading and writing data using these encoding schemespython automatically skips or writes the bom if it is either implied by general encoding nameor if you provide more specific encoding name to force the issue for instancein utf- the bom is always processed for "utf- ,and the more specific encoding name "utf- -ledenotes little-endian format in utf- the more specific encoding "utf- -sigforces python to both skip and write bom on input and outputrespectivelybut the general "utf- does not using unicode files
1,756
let' make some files with boms to see how this works in practice when you save text file in windows notepadyou can specify its encoding type in drop-down list-simple ascii textutf- or littleor big-endian utf- if two-line text file named spam txt is saved in notepad as the encoding type ansifor instanceit' written as simple ascii text without bom when this file is read in binary mode in pythonwe can see the actual bytes stored in the file when it' read as textpython performs endof-line translation by defaultwe can also decode it as explicit utf- text since ascii is subset of this scheme (and utf- is python ' default encoding) :\codec:\python \python file saved in notepad import sys sys getdefaultencoding('utf- open('spam txt''rb'read(ascii (utf- text file 'spam\ \nspam\ \nopen('spam txt'' 'read(text mode translates line end 'spam\nspam\nopen('spam txt'' 'encoding='utf- 'read('spam\nspam\nif this file is instead saved as utf- in notepadit is prepended with -byte utf- bom sequenceand we need to give more specific encoding name ("utf- -sig"to force python to skip the markeropen('spam txt''rb'read(utf- with -byte bom '\xef\xbb\xbfspam\ \nspam\ \nopen('spam txt'' 'read(' >>?spam\nspam\nopen('spam txt'' 'encoding='utf- 'read('\ufeffspam\nspam\nopen('spam txt'' 'encoding='utf- -sig'read('spam\nspam\nif the file is stored as unicode big endian in notepadwe get utf- -format data in the filewith -byte ( -bitcharacters prepended with -byte bom sequence--the encoding name "utf- in python skips the bom because it is implied (since all utf- files have bom)and "utf- -behandles the big-endian format but does not skip the bom (the second of the following fails to print on older pythons)open('spam txt''rb'read( '\xfe\xff\ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \nopen('spam txt'' 'read('\xfey\ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \nopen('spam txt'' 'encoding='utf- 'read('spam\nspam\nopen('spam txt'' 'encoding='utf- -be'read('\ufeffspam\nspam\nnotepad' "unicode,by the wayis utf- little endian (whichof courseis one of very many kinds of unicode encoding! unicode and byte strings
1,757
the same patterns generally hold true for output when writing unicode file in python codewe need more explicit encoding name to force the bom in utf- --"utf- does not write (or skipthe bombut "utf- -sigdoesopen('temp txt'' 'encoding='utf- 'write('spam\nspam\ ' open('temp txt''rb'read(no bom 'spam\ \nspam\ \nopen('temp txt'' 'encoding='utf- -sig'write('spam\nspam\ ' open('temp txt''rb'read(wrote bom '\xef\xbb\xbfspam\ \nspam\ \nopen('temp txt'' 'read(' >>?spam\nspam\nopen('temp txt'' 'encoding='utf- 'read('\ufeffspam\nspam\nopen('temp txt'' 'encoding='utf- -sig'read('spam\nspam\nkeeps bom skips bom notice that although "utf- does not drop the bomdata without bom can be read with both "utf- and "utf- -sig"--use the latter for input if you're not sure whether bom is present in file (and don' read this paragraph out loud in an airport security line!)open('temp txt'' 'write('spam\nspam\ ' open('temp txt''rb'read( 'spam\ \nspam\ \nopen('temp txt'' 'read('spam\nspam\nopen('temp txt'' 'encoding='utf- 'read('spam\nspam\nopen('temp txt'' 'encoding='utf- -sig'read('spam\nspam\ndata without bom either utf- works finallyfor the encoding name "utf- ,the bom is handled automaticallyon outputdata is written in the platform' native endiannessand the bom is always writtenon inputdata is decoded per the bomand the bom is always stripped because it' standard in this schemesys byteorder 'littleopen('temp txt'' 'encoding='utf- 'write('spam\nspam\ ' open('temp txt''rb'read( '\xff\xfes\ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ open('temp txt'' 'encoding='utf- 'read('spam\nspam\nusing unicode files
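The UTF-8 BOM behavior can be summarized in a short sketch under Python 3.X; the \r appears on Windows only, and the filename is arbitrary.

open('bom.txt', 'w', encoding='utf-8-sig').write('spam\n')        # writes the 3-byte BOM
print(open('bom.txt', 'rb').read())                                # b'\xef\xbb\xbfspam\r\n' (or \n)
print(repr(open('bom.txt', 'r', encoding='utf-8').read()))         # '\ufeffspam\n': BOM kept
print(repr(open('bom.txt', 'r', encoding='utf-8-sig').read()))     # 'spam\n': BOM skipped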
1,758
may have to manually write and skip the bom yourself in some scenarios if it is required or present--study the following examples for more bom-making instructionsopen('temp txt'' 'encoding='utf- -be'write('\ufeffspam\nspam\ ' open('spam txt''rb'read( '\xfe\xff\ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \nopen('temp txt'' 'encoding='utf- 'read('spam\nspam\nopen('temp txt'' 'encoding='utf- -be'read('\ufeffspam\nspam\nthe more specific utf- encoding names work fine with bom-less filesthough "utf- requires one on input in order to determine byte orderopen('temp txt'' 'encoding='utf- -le'write('spam' open('temp txt''rb'read(ok if bom not present or expected ' \ \ \ \ open('temp txt'' 'encoding='utf- -le'read('spamopen('temp txt'' 'encoding='utf- 'read(unicodeerrorutf- stream does not start with bom experiment with these encodings yourself or see python' library manuals for more details on the bom unicode files in the preceding discussion applies to python ' string types and files you can achieve similar effects for unicode files in xbut the interface is different howeverif you replace str with unicode and open with codecs openthe result is essentially the same in xc:\codec:\python \python ' \xc \xe cprint aabec len( encode('latin- '' \xc \xe cs encode('utf- '' \xc \ \xc \xa cimport codecs codecs open('latindata'' 'encoding='latin- 'write(scodecs open('utfdata'' 'encoding='utf- 'write(sopen('latindata''rb'read(' \xc \xe copen('utfdata''rb'read(' \xc \ \xc \xa unicode and byte strings type manual calls files writes encode
1,759
' \xc \xe ccodecs open('utfdata'' 'encoding='utf- 'read( ' \xc \xe cprint codecs open('utfdata'' 'encoding='utf- 'read(aabec reads decode print to view for more unicode detailssee earlier sections of this and python manuals unicode filenames and streams in closingthis section has focused on the encoding and decoding of unicode text file contentbut python also supports the notion of non-ascii file names in factthey are independent settings in syswhich can vary per python version and platform ( returns ascii for the first of the following on windows)import sys sys getdefaultencoding()sys getfilesystemencoding(('utf- ''mbcs'file contentnames filenamestext versus bytes filename encoding is often nonissue in shortfor filenames given as unicode text stringsthe open call encodes automatically to and from the underlying platform' filename conventions passing arbitrarily pre-encoded filenames as byte strings to file tools (including open and directory walkers and listersoverrides automatic encodingsand forces filename results to be returned in encoded byte string form too--useful if filenames are undecodable per the underlying platform' conventions ( ' using windowsbut some of the following may fail on other platforms) open('xxx\ '' ' write('\xa \ ' close(print(open('xxx\ 'read() = print(open( 'xxx\xa 'read() = import glob glob glob('*\ *'['xxxy='glob glob( '*\xa *'[ 'xxx\xa 'non-ascii filename writes five characters textauto-encoded bytespre-encoded filename expansion tool get decoded text for decoded text get encoded bytes for encoded bytes stream contentpythonioencoding in additionthe environment variable pythonioencoding can be used to set the encoding used for text in the standard streams--inputoutputand error this setting overrides python' default encoding for printed textwhich on windows currently uses winusing unicode files
utf- may sometimes be required to print non-ascii textand to display such text in shell windows (possibly in conjunction with code page changes on some windows machinesa script that prints non-ascii filenamesfor examplemay fail unless this setting is made for more background on this subjectsee also "currency symbolsunicode in actionin therewe work through an example that demonstrates the essentials of portable unicode codingas well as the roles and requirements of pythonioencoding settingswhich we won' rehash here for more on these topics in generalsee python manuals or books such as programming python th edition (or laterif later may bethe latter of these digs deeper into streams and files from an applications-level perspective other string tool changes in many of the other popular string-processing tools in python' standard library have also been revamped for the new str/bytes type dichotomy we won' cover any of these application-focused tools in much detail in this core language bookbut to wrap up this here' quick look at four of the major tools impactedthe re patternmatching modulethe struct binary data modulethe pickle object serialization moduleand the xml package for parsing xml text as noted aheadother python toolssuch as its json modulediffer in ways similar to those presented here the re pattern-matching module python' re pattern-matching module supports text processing that is more general than that afforded by simple string method calls such as findsplitand replace with restrings that designate searching and splitting targets can be described by general patternsinstead of absolute text this module has been generalized to work on objects of any string type in --strbytesand bytearray--and returns result substrings of the same type as the subject string in it supports both unicode and str here it is at work in xextracting substrings from line of text--borrowedof coursefrom monty python' the meaning of life within pattern strings*means any character (the )zero or more times (the *)saved away as matched substring (the ()parts of the string matched by the parts of pattern enclosed in parentheses are available after successful matchvia the group or groups methodc:\codec:\python \python import re 'bugger all down here on earth! 'bugger all down here on earth!line of text usually from file re match('*down *on *)'sgroups(('bugger all''here''earth!'match line to pattern matched substrings unicode and byte strings
( 'bugger all' 'here' 'earth!'bytes substrings in python results are similarbut the unicode type is used for non-ascii textand str handles both -bit and binary textc:\codec:\python \python import re 'bugger all down here on earth! 'bugger all down here on earth!simple text and binary unicode text re match('*down *on *)'sgroups(('bugger all''here''earth!'re match('*down *on *)'ugroups(( 'bugger all' 'here' 'earth!'since bytes and str support essentially the same operation setsthis type distinction is largely transparent but note thatlike in other apisyou can' mix str and bytes types in its callsarguments in (although if you don' plan to do pattern matching on binary datayou probably don' need to care) :\codec:\python \python import re 'bugger all down here on earth! 'bugger all down here on earth!re match('*down *on *)'bgroups(typeerrorcan' use string pattern on bytes-like object re match( '*down *on *)'sgroups(typeerrorcan' use bytes pattern on string-like object re match( '*down *on *)'bytearray( )groups((bytearray( 'bugger all')bytearray( 'here')bytearray( 'earth!')re match('*down *on *)'bytearray( )groups(typeerrorcan' use string pattern on bytes-like object the struct binary data module the python struct moduleused to create and extract packed binary data from stringsalso works the same in as it does in xbut in packed data is represented as bytes and bytearray objects onlynot str objects (which makes sensegiven that it' intended for processing binary datanot decoded text)and "sdata code values must be bytes as of (the former str utf- auto-encode is droppedhere are both pythons in actionpacking three objects into string according to binary type specification (they create -byte integera -byte stringand -byte integer) :\codec:\python \python from struct import pack pack('> sh' 'spam' bytes in ( -bit stringsother string tool changes in
:\codec:\python \python from struct import pack pack('> sh' 'spam' '\ \ \ \ spam\ \ str in ( -bit stringssince bytes has an almost identical interface to that of str in and xthoughmost programmers probably won' need to care--the change is irrelevant to most existing codeespecially since reading from binary file creates bytes automatically although the last test in the following example fails on type mismatchmost scripts will read binary data from filenot create it as string as we do herec:\codec:\python \python import struct struct pack('> sh' 'spam' '\ \ \ \ spam\ \ vals struct unpack('> sh'bvals ( 'spam' vals struct unpack('> sh' decode()typeerror'strdoes not support the buffer interface apart from the new syntax for bytescreating and reading binary files works almost the same in as it does in stillcode like this is one of the main places where programmers will notice the bytes object typec:\codec:\python \python write values to packed binary file open('data bin''wb'import struct data struct pack('> sh' 'spam' data '\ \ \ \ spam\ \ write(data close(read values from packed binary file open('data bin''rb'data read(data '\ \ \ \ spam\ \ values struct unpack('> sh'datavalues ( 'spam' open binary output file create packed binary data bytes in xnot str write to the file open binary input file read bytes extract packed binary data back to python objects once you've extracted packed binary data into python objects like thisyou can dig even further into the binary world if you have to--strings can be indexed and sliced to get individual bytesvaluesindividual bits can be extracted from integers with bitwise operatorsand so on (see earlier in this book for more on the operations applied here) unicode and byte strings
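For instance, here is a minimal self-contained sketch of the idea just described—pack values into bytes with struct, unpack them, and then probe individual bits of the unpacked integers. The '>i4sh' format string and the sample values follow the chapter's running example; the printed results are noted in comments:

import struct

data = struct.pack('>i4sh', 7, b'spam', 8)       # 4-byte int, 4-byte string, 2-byte int
values = struct.unpack('>i4sh', data)            # (7, b'spam', 8): back to Python objects

number = values[0]                               # An unpacked integer
print(bin(number))                               # '0b111': its bits
print(number & 0b1)                              # 1: test the lowest bit
print(number | 0b1000)                           # 15: turn a higher bit on
print(bool(number & 0b1000))                     # False: that bit isn't set in 7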
( 'spam' accessing bits of parsed integers bin(values[ ]' values[ values[ bin(values[ ' bin(values[ ' bool(values[ true bool(values[ false result of struct unpack can get to bits in ints test first (lowestbit in int bitwise orturn bits on decimal is binary bitwise xoroff if both true test if bit is on test if bit is set since parsed bytes strings are sequences of small integerswe can do similar processing with their individual bytesaccessing bytes of parsed strings and bits within them values[ 'spamvalues[ ][ bytes stringsequence of ints values[ ][ :prints as ascii characters 'pambin(values[ ][ ]can get to bits of bytes in strings ' bin(values[ ][ turn bits on ' values[ ][ of coursemost python programmers don' deal with binary bitspython has higherlevel object typeslike lists and dictionaries that are generally better choice for representing information in python scripts howeverif you must use or produce lowerlevel data used by programsnetworking librariesor other interfacespython has tools to assist the pickle object serialization module we met the pickle module briefly in and in we also used the shelve modulewhich uses pickle internally for completeness herekeep in mind that the python version of the pickle module always creates bytes objectregardless of the default or passed-in "protocol(data format levelyou can see this by using the module' dumps call to return an object' pickle stringc:\codec:\python \python import pickle dumps(returns pickle string other string tool changes in
'\ \ ] \ ( \ \ \ python default protocol= =binary pickle dumps([ ]protocol= '(lp \nl \nal \nal \na ascii protocol but still bytesthis implies that files used to store pickled objects must always be opened in binary mode in python xsince text files use str strings to represent datanot bytes--the dump call simply attempts to write the pickle string to an open output filepickle dump([ ]open('temp'' ')typeerrormust be strnot bytes text files fail on bytesdespite protocol value pickle dump([ ]open('temp'' ')protocol= typeerrormust be strnot bytes pickle dump([ ]open('temp''wb')always use binary in open('temp'' 'read('\ ac\ ] \ ( \ \ \ this worksbut just by luck notice the last result here didn' issue an error in text mode only because the stored binary data was compatible with the windows platform' utf- default decoderthis was really just luck (and in factthis command failed when printing in older pythonsand may fail on other platformsbecause pickle data is not generally decodable unicode textthe same rule holds on input--correct usage in requires always both writing and reading pickle data in binary modeswhether unpickling or notpickle dump([ ]open('temp''wb')pickle load(open('temp''rb')[ open('temp''rb'read( '\ \ ] \ ( \ \ \ in python xwe can get by with text-mode files for pickled dataas long as the protocol is level (the default in xand we use text mode consistently to convert line endsc:\codec:\python \python import pickle pickle dumps([ ]'(lp \ni \nai \nai \na python default= =ascii pickle dumps([ ]protocol= '] \ ( \ \ \ pickle dump([ ]open('temp'' ')pickle load(open('temp')[ open('temp'read('(lp \ni \nai \nai \na unicode and byte strings text mode works in
their version-specific defaultsalways use binary-mode files for pickled data--the following works the same in python and ximport pickle pickle dump([ ]open('temp''wb')pickle load(open('temp''rb')[ version neutral and required in because almost all programs let python pickle and unpickle objects automatically and do not deal with the content of pickled data itselfthe requirement to always use binary file modes is the only significant incompatibility in python ' newer pickling model see reference books or python' manuals for more details on object pickling xml parsing tools xml is tag-based language for defining structured informationcommonly used to define documents and data shipped over the web although some information can be extracted from xml text with basic string methods or the re pattern modulexml' nesting of constructs and arbitrary attribute text tend to make full parsing more accurate because xml is such pervasive formatpython itself comes with an entire package of xml parsing tools that support the sax and dom parsing modelsas well as package known as elementtree-- python-specific api for parsing and constructing xml beyond basic parsingthe open source domain provides support for additional xml toolssuch as xpathxqueryxsltand more xml by definition represents text in unicode formto support internationalization although most of python' xml parsing tools have always returned unicode stringsin python their results have mutated from the unicode type to the general str string type--which makes sensegiven that ' str string is unicodewhether the encoding is ascii or other we can' go into many details herebut to sample the flavor of this domainsuppose we have simple xml text filemybooks xml ~ learning python programming python python pocket reference 'reilly media and we want to run script to extract and display the content of all the nested title tagsas followslearning python programming python python pocket reference other string tool changes in
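For reference while reading the parsing scripts ahead, the following sketch recreates a mybooks.xml file of this general shape. The title and publisher text appear in this section; the date value here is an assumption made only so the file is complete:

sample = '''\
<books>
    <date>1995~2013</date>
    <title>Learning Python</title>
    <title>Programming Python</title>
    <title>Python Pocket Reference</title>
    <publisher>O'Reilly Media</publisher>
</books>
'''
open('mybooks.xml', 'w').write(sample)           # Date value assumed; titles/publisher per the text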
Setting aside more advanced tools like XPath, first, we could run basic pattern matching on the file's text, though this tends to be inaccurate if the text is unpredictable. Where applicable, the re module we met earlier does the job—its match method looks for a match at the start of a string, search scans ahead for a match, and the findall method used here locates all places where the pattern matches in the string (the result comes back as a list of matched substrings corresponding to parenthesized pattern groups, or tuples of such for multiple groups):

# File patternparse.py

import re
text = open('mybooks.xml').read()
found = re.findall('<title>(.*)</title>', text)
for title in found: print(title)

Second, to be more robust, we could perform complete XML parsing with the standard library's DOM parsing support. DOM parses XML text into a tree of objects and provides an interface for navigating the tree to extract tag attributes and values; the interface is a formal specification, independent of Python:

# File domparse.py

from xml.dom.minidom import parse, Node

xmltree = parse('mybooks.xml')
for node1 in xmltree.getElementsByTagName('title'):
    for node2 in node1.childNodes:
        if node2.nodeType == Node.TEXT_NODE:
            print(node2.data)

As a third option, Python's standard library supports SAX parsing for XML. Under the SAX model, a class's methods receive callbacks as a parse progresses, and use state information to keep track of where they are in the document and to collect its data:

# File saxparse.py

import xml.sax.handler

class BookHandler(xml.sax.handler.ContentHandler):
    def __init__(self):
        self.inTitle = False
    def startElement(self, name, attributes):
        if name == 'title':
            self.inTitle = True
    def characters(self, data):
        if self.inTitle:
            print(data)
    def endElement(self, name):
        if name == 'title':
            self.inTitle = False

import xml.sax
parser = xml.sax.make_parser()
handler = BookHandler()
parser.setContentHandler(handler)
parser.parse('mybooks.xml')

Finally, the ElementTree system available in the etree package of the standard library can often achieve the same effects as XML DOM parsers, but with remarkably less code. It's a Python-specific way to both parse and generate XML text; after a parse, its API gives access to components of the document:

# File etreeparse.py

from xml.etree.ElementTree import parse

tree = parse('mybooks.xml')
for E in tree.findall('title'):
    print(E.text)

When run under either 2.X or 3.X, all four of these scripts display the same printed result:

c:\code> python domparse.py
Learning Python
Programming Python
Python Pocket Reference

Technically, though, in 2.X some of these scripts produce unicode string objects, while in 3.X all produce str strings, since that type includes Unicode text (whether ASCII or other). Under 3.X, for example:

>>> from xml.dom.minidom import parse, Node
>>> xmltree = parse('mybooks.xml')
>>> for node1 in xmltree.getElementsByTagName('title'):
...     for node2 in node1.childNodes:
...         if node2.nodeType == Node.TEXT_NODE:
...             node2.data
...
'Learning Python'
'Programming Python'
'Python Pocket Reference'

Under 2.X, the same code displays unicode objects instead:

u'Learning Python'
u'Programming Python'
u'Python Pocket Reference'

Programs that must deal with XML parsing results in nontrivial ways will need to account for the different object type in 2.X. Again, though, because all strings have nearly identical interfaces in both 2.X and 3.X, most scripts won't be affected by the change; tools available on unicode in 2.X are generally available on str in 3.X.
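The scripts above only parse; since ElementTree can also generate XML, here is a brief illustrative sketch of the building half (the output filename and element values are chosen just for this example):

import xml.etree.ElementTree as ET

root = ET.Element('books')                       # Build a document tree in memory
for name in ('Learning Python', 'Programming Python'):
    title = ET.SubElement(root, 'title')         # Add nested <title> elements
    title.text = name
ET.ElementTree(root).write('newbooks.xml')       # Serialize the tree to a file

print(open('newbooks.xml').read())               # One line of XML text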
regrettablygoing into further xml parsing details is beyond this book' scope if you are interested in text or xml parsingit is covered in more detail in the applicationsfocused follow-up book programming python for more details on restructpickleand xmlas well as the additional impacts of unicode on other library tools such as filename expansion and directory walkersconsult the webthe aforementioned book and othersand python' standard library manual for related topicsee also the json example in -- language-neutral data exchange formatwhose structure is very similar to python dictionaries and listsand whose strings are all unicode that differs in type between pythons and much the same as shown for xml here why you will careinspecting filesand much more as was updating this stumbled onto use case for some of its tools after saving formerly ascii html file in notepad as "utf , found that it had grown mystery non-ascii character along the way due to an apparent keyboard operator errorand would no longer work as ascii in text tools to find the bad characteri simply started pythondecoded the file' content from its utf- format via text mode fileand scanned character by character looking for the first byte that was not valid ascii character toof open('py -windows-launcher html'encoding='utf ' read(for (icin enumerate( )tryx encode(encoding='ascii'exceptprint(isys exc_info()[ ] with the bad character' index in handit' easy to slice the unicode string for more detailslen( [ : 'ugh \ cthit[ : 'trace through \ cthiafter fixingi could also open in binary mode to verify and explore actual undecoded file content furtherf open('py -windows-launcher html''rb' read( [ [: '\ \ < unicode and byte strings
convenient tactical tool in such casesand its file objects give you tangible window on your data when neededboth in scripts and interactive mode for more realistically scaled examples of unicode at worki suggest my other book programming python th edition (or laterthat book develops much larger programs than we can hereand has numerous up close and personal encounters with unicode along the wayin the context of filesdirectory walkersnetwork socketsguisemail content and headersweb page contentdatabasesand more though clearly an important topic in today' global software worldunicode is more mandatory than you might expectespecially in language like python xwhich elevates it to its core string and file typesthus bringing all its users into the unicode fold--ready or notsummary this explored in-depth the advanced string types available in python and for processing unicode text and binary data as we sawmany programmers use ascii text and can get by with the basic string type and its operations for more advanced applicationspython' string models fully support both richer unicode text (via the normal string type in and special type in xand byte-oriented data (represented with bytes type in and normal strings in xin additionwe learned how python' file object has mutated in to automatically encode and decode unicode text and deal with byte strings for binary-mode filesand saw similar utility for finallywe briefly met some text and binary data tools in python' libraryand sampled their behavior in and in the next we'll shift our focus to tool-builder topicswith look at ways to manage access to object attributes by inserting automatically run code before we move onthoughhere' set of questions to review what we've learned here this has been substantial so be sure to read the quiz answers eventually for more in-depth summary test your knowledgequiz what are the names and roles of string object types in python what are the names and roles of string object types in python what is the mapping between and string types how do python ' string types differ in terms of operations how can you code non-ascii unicode characters in string in what are the main differences between textand binary-mode files in python how would you read unicode text file that contains text in different encoding than the default for your platformtest your knowledgequiz
why is ascii text considered to be kind of unicode text how large an impact does python ' string types change have on your codetest your knowledgeanswers python has three string typesstr (for unicode textincluding ascii)bytes (for binary data with absolute byte values)and bytearray ( mutable flavor of bytesthe str type usually represents content stored on text fileand the other two types generally represent content stored on binary files python has two main string typesstr (for -bit text and binary dataand unicode (for possibly wider character unicode textthe str type is used for both text and binary file contentunicode is used for text file content that is generally more complex than -bit characters python (but not earlieralso has ' bytearray typebut it' mostly back-port and doesn' exhibit the sharp text/binary distinction that it does in the mapping from to string types is not directbecause ' str equates to both str and bytes in xand ' str equates to both str and unicode in the mutability of bytearray in is also unique in generalthoughunicode text is handled by str and unicodebyte-based data is handled by bytes and strand bytes and str can both handle some simpler types of text python ' string types share almost all the same operationsmethod callssequence operationsand even larger tools like pattern matching work the same way on the other handonly str supports string formatting operationsand bytear ray has an additional set of operations that perform in-place changes the str and bytes types also have methods for encoding and decoding textrespectively non-ascii unicode characters can be coded in string with both hex (\xnnand unicode (\unnnn\unnnnnnnnescapes on some machinessome non-ascii characters--certain latin- charactersfor example--can also be typed or pasted directly into codeand are interpreted per the utf- default or source code encoding directive comment in xtext-mode files assume their file content is unicode text (even if it' all asciiand automatically decode when reading and encode when writing with binary-mode filesbytes are transferred to and from the file unchanged the contents of text-mode files are usually represented as str objects in your scriptand the contents of binary files are represented as bytes (or bytearrayobjects textmode files also handle the bom for certain encoding types and automatically translate end-of-line sequences to and from the single \ character on input and output unless this is explicitly disabledbinary-mode files do not perform either of these steps python uses codecs open for unicode fileswhich encodes and decodes similarly ' open only translates line ends in text mode unicode and byte strings
simply pass the name of the file' encoding to the open built-in in (codecs open(in )data will be decoded per the specified encoding when it is read from the file you can also read in binary mode and manually decode the bytes to string by giving an encoding namebut this involves extra work and is somewhat error-prone for multibyte characters (you may accidentally read partial character sequence to create unicode text file in specific encoding formatpass the desired encoding name to open in (codecs open(in )strings will be encoded per the desired encoding when they are written to the file you can also manually encode string to bytes and write it in binary modebut this is usually extra work ascii text is considered to be kind of unicode textbecause its -bit range of values is subset of most unicode encodings for examplevalid ascii text is also valid latin- text (latin- simply assigns the remaining possible values in an -bit byte to additional charactersand valid utf- text (utf- defines variable-byte scheme for representing more charactersbut ascii characters are still represented with the same codesin single bytethis makes unicode backward-compatible with the mass of ascii text data in the world (though it also may have limited its options--self-identifying textfor instancemay have been difficult (though boms serve much the same role the impact of python ' string types change depends upon the types of strings you use for scripts that use simple ascii text on platforms with ascii-compatible default encodingsthe impact is probably minorthe str string type works the same in and in this case moreoveralthough string-related tools in the standard library such as restructpickleand xml may technically use different types in than in xthe changes are largely irrelevant to most programs because ' str and bytes and ' str support almost identical interfaces if you process unicode datathe toolset you need has simply moved from ' unicode and codecs open(to ' str and open if you deal with binary data filesyou'll need to deal with content as bytes objectssince they have similar interface to stringsthoughthe impact should again be minimal that saidthe update of the book programming python for ran across numerous cases where unicode' mandatory status in implied changes in standard library apis--from networking and guisto databases and email in generalunicode will probably impact most users eventually test your knowledgeanswers
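To make answer 7 concrete before moving on, a minimal sketch—the filename, encoding, and sample text here are illustrative only:

# Write, then read, a Latin-1 encoded text file explicitly (3.X open; codecs.open in 2.X)
open('data.txt', 'w', encoding='latin-1').write('spÄm\n')    # Encoded on the way out
print(open('data.txt', 'r', encoding='latin-1').read())      # Decoded on the way back in
print(open('data.txt', 'rb').read())                         # The raw encoded bytes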
managed attributes this expands on the attribute interception techniques introduced earlierintroduces anotherand employs them in handful of larger examples like everything in this part of the bookthis is classified as an advanced topic and optional readingbecause most applications programmers don' need to care about the material discussed here--they can fetch and set attributes on objects without concern for attribute implementations especially for tools buildersthoughmanaging attribute access can be an important part of flexible apis moreoveran understanding of the descriptor model covered here can make related tools such as slots and properties more tangibleand may even be required reading if it appears in code you must use why manage attributesobject attributes are central to most python programs--they are where we often store information about the entities our scripts process normallyattributes are simply names for objectsa person' name attributefor examplemight be simple stringfetched and set with basic attribute syntaxperson name person name value fetch attribute value change attribute value in most casesthe attribute lives in the object itselfor is inherited from class from which it derives that basic model suffices for most programs you will write in your python career sometimesthoughmore flexibility is required suppose you've written program to use name attribute directlybut then your requirements change--for exampleyou decide that names should be validated with logic when set or mutated in some way when fetched it' straightforward to code methods to manage access to the attribute' value (valid and transform are abstract here)class persondef getname(self)
raise typeerror('cannot fetch name'elsereturn self name transform(def setname(selfvalue)if not valid(value)raise typeerror('cannot change name'elseself name transform(valueperson person(person getname(person setname('value'howeverthis also requires changing all the places where names are used in the entire program-- possibly nontrivial task moreoverthis approach requires the program to be aware of how values are exportedas simple names or called methods if you begin with method-based interface to dataclients are immune to changesif you do notthey can become problematic this issue can crop up more often than you might expect the value of cell in spreadsheet-like programfor instancemight begin its life as simple discrete valuebut later mutate into an arbitrary calculation since an object' interface should be flexible enough to support such future changes without breaking existing codeswitching to methods later is less than ideal inserting code to run on attribute access better solution would allow you to run code automatically on attribute accessif needed that' one of the main roles of managed attributes--they provide ways to add attribute accessor logic after the fact more generallythey support arbitrary attribute usage modes that go beyond simple data storage at various points in this bookwe've met python tools that allow our scripts to dynamically compute attribute values when fetching them and validate or change attribute values when storing them in this we're going to expand on the tools already introducedexplore other available toolsand study some larger use-case examples in this domain specificallythis presents four accessor techniquesthe __getattr__ and __setattr__ methodsfor routing undefined attribute fetches and all attribute assignments to generic handler methods the __getattribute__ methodfor routing all attribute fetches to generic handler method the property built-infor routing specific attribute access to get and set handler functions managed attributes
with arbitrary get and set handler methodsand the basis for other tools such as properties and slots the tools in the first of these bullets are available in all pythons the last three bulletstools are available in python and new-style classes in --they first appeared in python along with many of the other advanced tools of such as slots and super we briefly met the first and third of these in and respectivelythe second and fourth are largely new topics we'll explore in full here as we'll seeall four techniques share goals to some degreeand it' usually possible to code given problem using any one of them they do differ in some important waysthough for examplethe last two techniques listed here apply to specific attributeswhereas the first two are generic enough to be used by delegation-based proxy classes that must route arbitrary attributes to wrapped objects as we'll seeall four schemes also differ in both complexity and aestheticsin ways you must see in action to judge for yourself besides studying the specifics behind the four attribute interception techniques listed in this sectionthis also presents an opportunity to explore larger programs than we've seen elsewhere in this book the cardholder case study at the endfor exampleshould serve as self-study example of larger classes in action we'll also be using some of the techniques outlined here in the next to code decoratorsso be sure you have at least general understanding of these topics before you move on properties the property protocol allows us to route specific attribute' getsetand delete operations to functions or methods we provideenabling us to insert code to be run automatically on attribute accessintercept attribute deletionsand provide documentation for the attributes if desired properties are created with the property built-in and are assigned to class attributesjust like method functions accordinglythey are inherited by subclasses and instanceslike any other class attributes their access-interception functions are provided with the self instance argumentwhich grants access to state information and class attributes available on the subject instance property manages singlespecific attributealthough it can' catch all attribute accesses genericallyit allows us to control both fetch and assignment accesses and enables us to change an attribute from simple data to computation freelywithout breaking existing code as we'll seeproperties are strongly related to descriptorsin factthey are essentially restricted form of them properties
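Before turning to properties in detail, here is a compact preview sketch of the four interception techniques listed earlier in this chapter, each managing an attribute named data. The class names and the doubling-on-fetch behavior are made up purely for illustration:

class WithGetattr:
    def __init__(self):
        self._data = 0
    def __getattr__(self, name):                     # Run for undefined fetches only
        if name == 'data':
            return self._data * 2
        raise AttributeError(name)

class WithGetattribute(object):                      # object required in 2.X
    def __init__(self):
        self._data = 0
    def __getattribute__(self, name):                # Run for every fetch
        if name == 'data':
            return object.__getattribute__(self, '_data') * 2
        return object.__getattribute__(self, name)   # Route others to avoid loops

class WithProperty(object):
    def __init__(self):
        self._data = 0
    def getdata(self):
        return self._data * 2
    data = property(getdata)                         # Manages this one attribute

class DataDesc(object):                              # Descriptor protocol class
    def __get__(self, instance, owner):
        return instance._data * 2

class WithDescriptor(object):
    data = DataDesc()                                # Class attribute = descriptor instance
    def __init__(self):
        self._data = 0

for Klass in (WithGetattr, WithGetattribute, WithProperty, WithDescriptor):
    obj = Klass()
    obj._data = 21
    print(Klass.__name__, obj.data)                  # Each prints its name and 42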
A property is created by assigning the result of a built-in function to a class attribute:

attribute = property(fget, fset, fdel, doc)

None of this built-in's arguments are required, and all default to None if not passed. For the first three, this None means that the corresponding operation is not supported, and attempting it will raise an AttributeError exception automatically.

When these arguments are used, we pass fget a function for intercepting attribute fetches, fset a function for assignments, and fdel a function for attribute deletions. Technically, all three of these arguments accept any callable, including a class's method, having a first argument to receive the instance being qualified. When later invoked, the fget function returns the computed attribute value, fset and fdel return nothing (really, None), and all three may raise exceptions to reject access requests.

The doc argument receives a documentation string for the attribute, if desired; otherwise, the property copies the docstring of the fget function, which as usual defaults to None.

This built-in property call returns a property object, which we assign to the name of the attribute to be managed in the class scope, where it will be inherited by every instance.

A First Example

To demonstrate how this translates to working code, the following class uses a property to trace access to an attribute named name; the actual stored data is named _name so it does not clash with the property (if you're working along with the book's examples package, some filenames in this chapter are implied by the command lines that run them following their listings):

class Person:                                    # Add (object) in 2.X
    def __init__(self, name):
        self._name = name
    def getName(self):
        print('fetch...')
        return self._name
    def setName(self, value):
        print('change...')
        self._name = value
    def delName(self):
        print('remove...')
        del self._name
    name = property(getName, setName, delName, "name property docs")

bob = Person('Bob Smith')                        # bob has a managed attribute
print(bob.name)                                  # Runs getName
bob.name = 'Robert Smith'                        # Runs setName
print(bob.name)
del bob.name                                     # Runs delName

print('-'*20)
sue = Person('Sue Jones')                        # sue inherits property too
print(sue.name)
print(Person.name.__doc__)                       # Or: help(Person.name)

Properties are available in both 2.X and 3.X, but they require new-style object derivation in 2.X to work correctly for assignments—add object as a superclass here to run this in 2.X. You can list the superclass in 3.X too, but it's implied and not required, and is sometimes omitted in this book to reduce clutter.

This particular property doesn't do much—it simply intercepts and traces an attribute—but it serves to demonstrate the protocol. When this code is run, two instances inherit the property, just as they would any other attribute attached to their class. However, their attribute accesses are caught:

c:\code> py -3 prop-person.py
fetch...
Bob Smith
change...
fetch...
Robert Smith
remove...
--------------------
fetch...
Sue Jones
name property docs

Like all class attributes, properties are inherited by both instances and lower subclasses. If we change our example as follows, for instance:

class Super:
    ...the original Person class code...
    name = property(getName, setName, delName, 'name property docs')

class Person(Super):
    pass                                         # Properties are inherited (class attrs)

bob = Person('Bob Smith')
...rest unchanged...

the output is the same—the Person subclass inherits the name property from Super, and the bob instance gets it from Person. In terms of inheritance, properties work the same as normal methods; because they have access to the self instance argument, they can access instance state information and methods irrespective of subclass depth, as the next section further demonstrates.
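One point from the property call's signature is worth seeing in action: accessor functions you omit default to None, and the corresponding operations then fail with an AttributeError. A minimal sketch, using an illustrative Salary class that passes only fget:

class Salary(object):                            # object needed in 2.X
    def __init__(self, amount):
        self._amount = amount
    def getamount(self):
        return self._amount
    amount = property(getamount)                 # No fset or fdel: fetch-only

worker = Salary(100000)
print(worker.amount)                             # 100000: runs getamount
try:
    worker.amount = 200000                       # No fset was passed to property...
except AttributeError as exc:
    print('rejected:', exc)                      # ...so assignment is rejected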
the example in the prior section simply traces attribute accesses usuallythoughproperties do much more--computing the value of an attribute dynamically when fetchedfor example the following example illustratesclass propsquaredef __init__(selfstart)self value start def getx(self)return self value * def setx(selfvalue)self value value property(getxsetxon attr fetch on attr assign no delete or docs propsquare( propsquare( two instances of class with property each has different state information print( xp print( xprint( * * * ( this class defines an attribute that is accessed as though it were static databut really runs code to compute its value when fetched the effect is much like an implicit method call when the code is runthe value is stored in the instance as state informationbut each time we fetch it via the managed attributeits value is automatically squaredc:\codepy - prop-computed py notice that we've made two different instances--because property methods automatically receive self argumentthey have access to the state information stored in instances in our casethis means the fetch computes the square of the subject instance' own data coding properties with decorators although we're saving additional details until the next we introduced function decorator basics earlierin recall that the function decorator syntax@decorator def func(args)is automatically translated to this equivalent by pythonto rebind the function name to the result of the decorator callabledef func(args)func decorator(func managed attributes
to define function that will run automatically when an attribute is fetchedclass person@property def name(self)rebindsname property(namewhen runthe decorated method is automatically passed to the first argument of the property built-in this is really just alternative syntax for creating property and rebinding the attribute name manuallybut may be seen as more explicit in this roleclass persondef name(self)name property(namesetter and deleter decorators as of python and property objects also have gettersetterand deleter methods that assign the corresponding property accessor methods and return copy of the property itself we can use these to specify components of properties by decorating normal methods toothough the getter component is usually filled in automatically by the act of creating the property itselfclass persondef __init__(selfname)self _name name @property def name(self)"name property docsprint('fetch 'return self _name @name setter def name(selfvalue)print('change 'self _name value @name deleter def name(self)print('remove 'del self _name bob person('bob smith'print(bob namebob name 'robert smithprint(bob namedel bob name print('-'* sue person('sue jones'print(sue nameprint(person name __doc__name property(namename name setter(namename name deleter(namebob has managed attribute runs name getter (name runs name setter (name runs name deleter (name sue inherits property too or help(person nameproperties
an alternative way to code properties in this case when it' runthe results are the samec:\codepy - prop-person-deco py fetch bob smith change fetch robert smith remove fetch sue jones name property docs compared to manual assignment of property resultsin this case using decorators to code properties requires just three extra lines of code-- seemingly negligible difference as is so often the case with alternative toolsthoughthe choice between the two techniques is largely subjective descriptors descriptors provide an alternative way to intercept attribute accessthey are strongly related to the properties discussed in the prior section reallya property is kind of descriptor--technically speakingthe property built-in is just simplified way to create specific type of descriptor that runs method functions on attribute accesses in factdescriptors are the underlying implementation mechanism for variety of class toolsincluding both properties and slots functionally speakingthe descriptor protocol allows us to route specific attribute' getsetand delete operations to methods of separate class' instance object that we provide this allows us to insert code to be run automatically on attribute fetches and assignmentsintercept attribute deletionsand provide documentation for the attributes if desired descriptors are created as independent classesand they are assigned to class attributes just like method functions like any other class attributethey are inherited by subclasses and instances their access-interception methods are provided with both self for the descriptor instance itselfas well as the instance of the client class whose attribute references the descriptor object because of thisthey can retain and use state information of their ownas well as state information of the subject instance for examplea descriptor may call methods available in the client classas well as descriptorspecific methods it defines like propertya descriptor manages singlespecific attributealthough it can' catch all attribute accesses genericallyit provides control over both fetch and assignment accesses and allows us to change an attribute name freely from simple data to computation without breaking existing code properties really are just convenient way to managed attributes
directly unlike propertiesdescriptors are broader in scopeand provide more general tool for instancebecause they are coded as normal classesdescriptors have their own statemay participate in descriptor inheritance hierarchiescan use composition to aggregate objectsand provide natural structure for coding internal methods and attribute documentation strings the basics as mentioned previouslydescriptors are coded as separate classes and provide specially named accessor methods for the attribute access operations they wish to intercept --getsetand deletion methods in the descriptor class are automatically run when the attribute assigned to the descriptor class instance is accessed in the corresponding wayclass descriptor"docstring goes heredef __get__(selfinstanceowner)def __set__(selfinstancevalue)def __delete__(selfinstance)return attr value return nothing (nonereturn nothing (noneclasses with any of these methods are considered descriptorsand their methods are special when one of their instances is assigned to another class' attribute--when the attribute is accessedthey are automatically invoked if any of these methods are absentit generally means that the corresponding type of access is not supported unlike propertieshoweveromitting __set__ allows the descriptor attribute' name to be assigned and thus redefined in an instancethereby hiding the descriptor--to make an attribute read-onlyyou must define __set__ to catch assignments and raise an exception descriptors with __set__ methods also have some special-case implications for inheritance that we'll largely defer until ' coverage of metaclasses and the complete inheritance specification in shorta descriptor with __set__ is known formally as data descriptorand is given precedence over other names located by normal inheritance rules the inherited descriptor for name __class__for exampleoverrides the same name in an instance' namespace dictionary this also works to ensure that data descriptors you code in your own classes take precedence over others descriptor method arguments before we code anything realisticlet' take brief look at some fundamentals all three descriptor methods outlined in the prior section are passed both the descriptor class instance (self)and the instance of the client class to which the descriptor instance is attached (instancethe __get__ access method additionally receives an owner argumentspecifying the class to which the descriptor instance is attached its instance argument is either the instance through which the attribute was accessed (for instance attr)or none when the atdescriptors
these generally computes value for instance accessand the latter usually returns self if descriptor object access is supported for examplein the following sessionwhen attr is fetchedpython automatically runs the __get__ method of the descriptor class instance to which the subject attr class attribute is assigned in xuse the print statement equivalentand derive both classes here from objectas descriptors are new-style class toolin this derivation is implied and can be omittedbut doesn' hurtclass descriptoradd "(object)in def __get__(selfinstanceowner)print(selfinstanceownersep='\ 'add "(object)in descriptor instance is class attr class subjectattr descriptor( subject( attr subject attr none notice the arguments automatically passed in to the __get__ method in the first attribute fetch--when attr is fetchedit' as though the following translation occurs (though the subject attr here doesn' invoke __get__ again) attr -descriptor __get__(subject attrxsubjectthe descriptor knows it is being accessed directly when its instance argument is none read-only descriptors as mentioned earlierunlike propertiessimply omitting the __set__ method in descriptor isn' enough to make an attribute read-onlybecause the descriptor name can be assigned to an instance in the followingthe attribute assignment to stores in the instance object xthereby hiding the descriptor stored in class cclass ddef __get__(*args)print('get'class ca ( ( get managed attributes attribute is descriptor instance runs inherited descriptor __get__
list( __dict__ keys()[' ' ( get get stored on xhiding ay still inherits descriptor this is the way all instance attribute assignments work in pythonand it allows classes to selectively override class-level defaults in their instances to make descriptor-based attribute read-onlycatch the assignment in the descriptor class and raise an exception to prevent attribute assignment--when assigning an attribute that is descriptorpython effectively bypasses the normal instance-level assignment behavior and routes the operation to the descriptor objectclass ddef __get__(*args)print('get'def __set__(*args)raise attributeerror('cannot set'class ca ( ( get attributeerrorcannot set routed to __get__ routed to __set__ also be careful not to confuse the descriptor __delete__ method with the general __del__ method the former is called on attempts to delete the managed attribute name on an instance of the owner classthe latter is the general instance destructor methodrun when an instance of any kind of class is about to be garbage-collected __delete__ is more closely related to the __delattr__ generic attribute deletion method we'll meet later in this see for more on operator overloading methods first example to see how this all comes together in more realistic codelet' get started with the same first example we wrote for properties the following defines descriptor that intercepts access to an attribute named name in its clients its methods use their instance argument to access state information in the subject instancewhere the name string is actually stored like propertiesdescriptors work properly only for new-style classesso be sure to derive both classes in the following from object if you're using --it' not enough to derive just the descriptoror just its clientdescriptors
class Name:                                      # Use (object) in 2.X
    "name descriptor docs"
    def __get__(self, instance, owner):
        print('fetch...')
        return instance._name
    def __set__(self, instance, value):
        print('change...')
        instance._name = value
    def __delete__(self, instance):
        print('remove...')
        del instance._name

class Person:                                    # Use (object) in 2.X
    def __init__(self, name):
        self._name = name
    name = Name()                                # Assign descriptor to attr

bob = Person('Bob Smith')                        # bob has a managed attribute
print(bob.name)                                  # Runs Name.__get__
bob.name = 'Robert Smith'                        # Runs Name.__set__
print(bob.name)
del bob.name                                     # Runs Name.__delete__

print('-'*20)
sue = Person('Sue Jones')                        # sue inherits descriptor too
print(sue.name)
print(Name.__doc__)                              # Or: help(Name)

Notice in this code how we assign an instance of our descriptor class to a class attribute in the client class; because of this, it is inherited by all instances of the class, just like a class's methods. Really, we must assign the descriptor to a class attribute like this—it won't work if assigned to a self instance attribute instead. When the descriptor's __get__ method is run, it is passed three objects to define its context:

self is the Name class instance.
instance is the Person class instance.
owner is the Person class.

When this code is run, the descriptor's methods intercept accesses to the attribute, much like the property version. In fact, the output is the same again:

c:\code> py -3 desc-person.py
fetch...
Bob Smith
change...
fetch...
Robert Smith
remove...
--------------------
fetch...
Sue Jones
name descriptor docs
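Tying back to the read-only discussion just above, here is a compact runnable sketch of a descriptor whose __set__ simply raises an exception, making the managed name effectively read-only; the class names and the constant value are illustrative:

class ReadOnly(object):                          # object needed in 2.X
    def __get__(self, instance, owner):
        return 42                                # A stored or computed value
    def __set__(self, instance, value):          # Without __set__, an instance
        raise AttributeError('cannot set')       # assignment would hide the descriptor

class Client(object):
    x = ReadOnly()                               # Class attribute = descriptor instance

c = Client()
print(c.x)                                       # 42: runs ReadOnly.__get__
try:
    c.x = 99                                     # Routed to ReadOnly.__set__...
except AttributeError as exc:
    print('rejected:', exc)                      # ...which rejects the change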
also like in the property exampleour descriptor class instance is class attribute and thus is inherited by all instances of the client class and any subclasses if we change the person class in our example to the followingfor instancethe output of our script is the sameclass superdef __init__(selfname)self _name name name name(class person(super)pass descriptors are inherited (class attrsalso note that when descriptor class is not useful outside the client classit' perfectly reasonable to embed the descriptor' definition inside its client syntactically here' what our example looks like if we use nested classclass persondef __init__(selfname)self _name name class name"name descriptor docsdef __get__(selfinstanceowner)print('fetch 'return instance _name def __set__(selfinstancevalue)print('change 'instance _name value def __delete__(selfinstance)print('remove 'del instance _name name name(using nested class when coded this wayname becomes local variable in the scope of the person class statementsuch that it won' clash with any names outside the class this version works the same as the original--we've simply moved the descriptor class definition into the client class' scope--but the last line of the testing code must change to fetch the docstring from its new location (per the example file desc-person-nested py)print(person name __doc__differsnot name __doc__ outside class computed attributes as was the case when using propertiesour first descriptor example of the prior section didn' do much--it simply printed trace messages for attribute accesses in practicedescriptors can also be used to compute attribute values each time they are fetched the following illustrates--it' rehash of the same example we coded for propertiesdescriptors
fetchedclass descsquaredef __init__(selfstart)self value start def __get__(selfinstanceowner)return self value * def __set__(selfinstancevalue)self value value class client descsquare( class client descsquare( each desc has own state on attr fetch on attr assign no delete or docs assign descriptor instance to class attr another instance in another client class could also code two instances in same class client ( client (print( xc print( xprint( * * * ( when runthe output of this example is the same as that of the original property-based versionbut here descriptor class object is intercepting the attribute accessesc:\codepy - desc-computed py using state information in descriptors if you study the two descriptor examples we've written so faryou might notice that they get their information from different places--the first (the name attribute exampleuses data stored on the client instanceand the second (the attribute squaring exampleuses data attached to the descriptor object itself ( selfin factdescriptors can use both instance state and descriptor stateor any combination thereofdescriptor state is used to manage either data internal to the workings of the descriptoror data that spans all instances it can vary per attribute appearance (oftenper client classinstance state records information related to and possibly created by the client class it can vary per client class instance (that isper application objectin other wordsdescriptor state is per-descriptor data and instance state is per-clientinstance data as usual in oopyou must choose state carefully for instanceyou would not normally use descriptor state to record employee namessince each client instance requires its own value--if stored in the descriptoreach client class instance managed attributes
use instance state to record data pertaining to descriptor implementation internals--if stored in each instancethere would be multiple varying copies descriptor methods may use either state formbut descriptor state often makes it unnecessary to use special naming conventions to avoid name collisions in the instance for data that is not instance-specific for examplethe following descriptor attaches information to its own instanceso it doesn' clash with that on the client class' instance--but also shares that information between two client instancesclass descstatedef __init__(selfvalue)self value value def __get__(selfinstanceowner)print('descstate get'return self value def __set__(selfinstancevalue)print('descstate set'self value value use descriptor state(objectin on attr fetch on attr assign client class class calcattrsx descstate( def __init__(self)self descriptor class attr class attr obj calcattrs(print(obj xobj yobj zobj calcattrs obj print(obj xobj yobj zx is computedothers are not assignment is intercepted reassigned in class assigned in instance obj calcattrs(print(obj xobj yobj zinstance attr but uses shared datalike ythis code' internal value information lives only in the descriptorso there won' be collision if the same name is used in the client' instance notice that only the descriptor attribute is managed here--get and set accesses to are interceptedbut accesses to and are not ( is attached to the client class and to the instancewhen this code is runx is computed when fetchedbut its value is also the same for all client instances because it uses descriptor-level statec:\codepy - desc-state-desc py descstate get descstate set descstate get descstate get descriptors
instanceinstead of itself cruciallyunlike data stored in the descriptor itselfthis allows for data that can vary per client class instance the descriptor in the following example assumes the instance has an attribute _x attached by the client classand uses it to compute the value of the attribute it representsclass inststatedef __get__(selfinstanceowner)print('inststate get'return instance _x def __set__(selfinstancevalue)print('inststate set'instance _x value client class class calcattrsx inststate( def __init__(self)self _x self obj calcattrs(print(obj xobj yobj zobj calcattrs obj print(obj xobj yobj zobj calcattrs(print(obj xobj yobj zusing instance state(objectin assume set by client class descriptor class attr class attr instance attr instance attr is computedothers are not assignment is intercepted reassigned in class assigned in instance but differs nowlike zherex is assigned to descriptor as before that manages accesses the new descriptor herethoughhas no information itselfbut it uses an attribute assumed to exist in the instance--that attribute is named _xto avoid collisions with the name of the descriptor itself when this version is run the results are similarbut the value of the descriptor attribute can vary per client instance due to the differing state policyc:\codepy - desc-state-inst py inststate get inststate set inststate get inststate get both descriptor and instance state have roles in factthis is general advantage that descriptors have over properties--because they have state of their ownthey can easily retain data internallywithout adding it to the namespace of the client instance object as summarythe following uses both state sources--its self data retains per-attribute informationwhile its instance data can vary per client instance managed attributes
def __init__(selfdata)self data data def __get__(selfinstanceowner)return '% % (self datainstance datadef __set__(selfinstancevalue)instance data value class clientdef __init__(selfdata)self data data managed descboth('spam' client('eggs' managed 'spameggsi managed 'spami managed 'spamspamshow both data sources change instance data we'll revisit the implications of this choice in larger case study later in this before we move onrecall from ' coverage of slots that we can access "virtualattributes like properties and descriptors with tools like dir and getattreven though they don' exist in the instance' namespace dictionary whether you should access these this way probably varies per program--properties and descriptors may run arbitrary computationand may be less obviously instance "datathan slotsi __dict__ {'data''spam'[ for in dir(iif not startswith('__')['data''managed'getattr( 'data''spamgetattr( 'managed''spamspamfor attr in ( for in dir(iif not startswith('__'))print('% =% (attrgetattr(iattr))data =spam managed =spamspam the more generic __getattr__ and __getattribute__ tools we'll meet later are not designed to support this functionality--because they have no class-level attributestheir "virtualattribute names do not appear in dir results in exchangethey are also not limited to specific attribute names coded as properties or descriptorstools that share even more than this behavioras the next section explains descriptors
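Before moving on, here is a compact runnable sketch of this section's dual-state idea—per-attribute data kept on the descriptor itself alongside per-instance data kept on the client instance. Attribute names here are chosen for clarity rather than matching the listing exactly, and the sample values are illustrative:

class DescBoth(object):                          # object needed in 2.X
    def __init__(self, label):
        self.label = label                       # Descriptor state: per attribute
    def __get__(self, instance, owner):
        return '%s, %s' % (self.label, instance.data)
    def __set__(self, instance, value):
        instance.data = value                    # Instance state: per client object

class Client(object):
    managed = DescBoth('spam')                   # Descriptor instance as class attr
    def __init__(self, data):
        self.data = data

a, b = Client('eggs'), Client('ham')
print(a.managed)                                 # 'spam, eggs': both data sources
print(b.managed)                                 # 'spam, ham': instance data varies
a.managed = 'toast'                              # Changes a.data only
print(a.managed, '|', b.managed)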
as mentioned earlierproperties and descriptors are strongly related--the property built-in is just convenient way to create descriptor now that you know how both workyou should also be able to see that it' possible to simulate the property built-in with descriptor class like the followingclass propertydef __init__(selffget=nonefset=nonefdel=nonedoc=none)self fget fget self fset fset self fdel fdel save unbound methods self __doc__ doc or other callables def __get__(selfinstanceinstancetype=none)if instance is nonereturn self if self fget is noneraise attributeerror("can' get attribute"return self fget(instancepass instance to self in property accessors def __set__(selfinstancevalue)if self fset is noneraise attributeerror("can' set attribute"self fset(instancevaluedef __delete__(selfinstance)if self fdel is noneraise attributeerror("can' delete attribute"self fdel(instanceclass persondef getname(self)print('getname 'def setname(selfvalue)print('setname 'name property(getnamesetnameuse like property( person( name name 'bobdel name this property class catches attribute accesses with the descriptor protocol and routes requests to functions or methods passed in and saved in descriptor state when the class is created attribute fetchesfor exampleare routed from the person classto the property class' __get__ methodand back to the person class' getname with descriptorsthis "just works" :\codepy - prop-desc-equiv py getname setname attributeerrorcan' delete attribute note that this descriptor class equivalent only handles basic property usagethoughto use decorator syntax to also specify set and delete operationswe' have to extend managed attributes
accessor function and return the property object (self should sufficesince the prop erty built-in already does thiswe'll omit formal coding of this extension here descriptors and slots and more you can also probably now at least in part imagine how descriptors are used to implement python' slots extensioninstance attribute dictionaries are avoided by creating class-level descriptors that intercept slot name accessand map those names to sequential storage space in the instance unlike the explicit property callthoughmuch of the magic behind slots is orchestrated at class creation time both automatically and implicitlywhen __slots__ attribute is present in class see for more on slots (and why they're not recommended except in pathological use casesdescriptors are also used for other class toolsbut we'll omit further internals details heresee python' manuals and source code for more details in we'll also make use of descriptors to implement function decorators that apply to both functions and methods as you'll see therebecause descriptors receive both descriptor and subject class instances they work well in this rolethough nested functions are usually conceptually much simpler solution we'll also deploy descriptors as one way to intercept built-in operation method fetches in be sure to also see ' coverage of data descriptorsprecedence in the full inheritance model mentioned earlierwith __set__descriptors override other namesand are thus fairly binding--they cannot be hidden by names in instance dictionaries __getattr__ and __getattribute__ so farwe've studied properties and descriptors--tools for managing specific attributes the __getattr__ and __getattribute__ operator overloading methods provide still other ways to intercept attribute fetches for class instances like properties and descriptorsthey allow us to insert code to be run automatically when attributes are accessed as we'll seethoughthese two methods can also be used in more general ways because they intercept arbitrary namesthey apply in broader roles such as delegationbut may also incur extra calls in some contextsand are too dynamic to register in dir results attribute fetch interception comes in two flavorscoded with two different methods__getattr__ is run for undefined attributes--because it is run only for attributes not stored on an instance or inherited from one of its classesits use is straightforward __getattr__ and __getattribute__
In a later chapter we'll also make use of descriptors to implement function decorators that apply to both functions and methods. As you'll see there, because descriptors receive both descriptor and subject class instances, they work well in this role, though nested functions are usually a conceptually simpler solution. We'll also deploy descriptors as one way to intercept built-in operation method fetches later in the book. Be sure to also see the coverage of data descriptors' precedence in the full inheritance model mentioned earlier: with a __set__, descriptors override other names, and are thus fairly binding; they cannot be hidden by names in instance dictionaries.

__getattr__ and __getattribute__

So far, we've studied properties and descriptors: tools for managing specific attributes. The __getattr__ and __getattribute__ operator overloading methods provide still other ways to intercept attribute fetches for class instances. Like properties and descriptors, they allow us to insert code to be run automatically when attributes are accessed. As we'll see, though, these two methods can also be used in more general ways. Because they intercept arbitrary names, they apply in broader roles such as delegation, but may also incur extra calls in some contexts, and are too dynamic to register in dir results.

Attribute fetch interception comes in two flavors, coded with two different methods. __getattr__ is run for undefined attributes; because it is run only for attributes not stored on an instance or inherited from one of its classes, its use is straightforward. __getattribute__ is run for every attribute, so you must be cautious when using this method to avoid recursive loops by passing attribute accesses to a superclass.

We met the former of these in earlier chapters; it's available for all Python versions. The latter is available for new-style classes in 2.X, and for all (implicitly new-style) classes in 3.X. These two methods are representatives of a set of attribute interception methods that also includes __setattr__ and __delattr__. Because these methods have similar roles, though, we will generally treat them all as a single topic here.

Unlike properties and descriptors, these methods are part of Python's general operator overloading protocol: specially named methods of a class, inherited by subclasses, and run automatically when instances are used in the implied built-in operation. Like all normal methods of a class, they each receive a first self argument when called, giving access to any required instance state information as well as other methods of the class in which they appear.

The __getattr__ and __getattribute__ methods are also more generic than properties and descriptors: they can be used to intercept access to any (or even all) instance attribute fetches, not just a single specific name. Because of this, these two methods are well suited to general delegation-based coding patterns; they can be used to implement wrapper (a.k.a. proxy) objects that manage all attribute accesses for an embedded object. By contrast, we must define one property or descriptor for every attribute we wish to intercept. As we'll see ahead, this role is impaired somewhat in new-style classes for built-in operations, but still applies to all named methods in a wrapped object's interface.

Finally, these two methods are more narrowly focused than the alternatives we considered earlier: they intercept attribute fetches only, not assignments. To also catch attribute changes by assignment, we must code a __setattr__ method, an operator overloading method run for every attribute assignment, which must take care to avoid recursive loops by routing attribute assignments through the instance namespace dictionary or a superclass method. Although less common, we can also code a __delattr__ overloading method (which must avoid looping in the same way) to intercept attribute deletions. By contrast, properties and descriptors catch get, set, and delete operations by design.

Most of these operator overloading methods were introduced earlier in the book; here, we'll expand on their usage and study their roles in larger contexts.
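As just noted, __getattr__ by itself has a fetch-only and undefined-only scope. The following tiny sketch, with names invented for illustration, makes that scope concrete before we turn to the details:

    # Only fetches of names not already present on the instance are intercepted.
    class FetchOnly:
        def __getattr__(self, name):
            print('fetch of undefined name: ' + name)
            return 42

    f = FetchOnly()
    f.missing                 # Prints the trace message; the fetch result is 42
    f.stored = 1              # Not intercepted: no __setattr__ is coded
    f.stored                  # Not intercepted either: now defined on the instance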
The Basics

__getattr__ and __setattr__ were introduced in earlier chapters, and __getattribute__ was mentioned briefly earlier as well. In short, if a class defines or inherits the following methods, they will be run automatically when an instance is used in the context described by the comments to the right:

    def __getattr__(self, name):            # On undefined attribute fetch [obj.name]
    def __getattribute__(self, name):       # On all attribute fetch [obj.name]
    def __setattr__(self, name, value):     # On all attribute assignment [obj.name=value]
    def __delattr__(self, name):            # On all attribute deletion [del obj.name]

In all of these, self is the subject instance object as usual, name is the string name of the attribute being accessed, and value is the object being assigned to the attribute. The two get methods normally return an attribute's value, and the other two return nothing (None). All can raise exceptions to signal prohibited access.

For example, to catch every attribute fetch, we can use either of the first two previous methods, and to catch every attribute assignment we can use the third. The following uses __getattr__ and works portably on both Python 2.X and 3.X, not requiring new-style object derivation in 2.X:

    class Catcher:
        def __getattr__(self, name):
            print('Get: %s' % name)
        def __setattr__(self, name, value):
            print('Set: %s %s' % (name, value))

    X = Catcher()
    X.job                                   # Prints "Get: job"
    X.pay                                   # Prints "Get: pay"
    X.pay = 99                              # Prints "Set: pay 99"

Using __getattribute__ works exactly the same in this specific case, but requires object derivation in 2.X (only), and has subtle looping potential, which we'll take up in the next section:

    class Catcher(object):                  # Need (object) in 2.X only
        def __getattribute__(self, name):   # Works same as __getattr__ here
            print('Get: %s' % name)         # But prone to loops on general use
        ...rest unchanged...

Such a coding structure can be used to implement the delegation design pattern we met earlier in the book. Because all attributes are routed to our interception methods generically, we can validate and pass them along to embedded, managed objects. The following class (borrowed from an earlier chapter), for example, traces every attribute fetch made to another object passed to the wrapper (proxy) class:

    class Wrapper:
        def __init__(self, object):
            self.wrapped = object                       # Save object
        def __getattr__(self, attrname):
            print('Trace: ' + attrname)                 # Trace fetch
            return getattr(self.wrapped, attrname)      # Delegate fetch

    X = Wrapper([1, 2, 3])
    X.append(4)                             # Prints "Trace: append"
    print(X.wrapped)                        # Prints "[1, 2, 3, 4]"

There is no such analog for properties and descriptors, short of coding accessors for every possible attribute in every possibly wrapped object.
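Because the delegation is generic, the same Wrapper class traces any kind of embedded object. The following short usage sketch, assuming the Wrapper class just shown is in scope, wraps a dictionary instead of a list; the specific keys and methods are chosen only for illustration:

    x = Wrapper({'a': 1, 'b': 2})           # Wrap a dictionary this time
    print(sorted(x.keys()))                 # Prints "Trace: keys", then ['a', 'b']
    print(x.get('a'))                       # Prints "Trace: get", then 1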
On the other hand, because these two methods intercept fetches only, extra steps are needed to also catch attribute assignments in some contexts--a tradeoff described elsewhere in this book and mentioned again in the context of the case study example we'll explore at the end of this chapter.

Avoiding loops in attribute interception methods

These methods are generally straightforward to use; their only substantially complex aspect is the potential for looping (i.e., recursing). Because __getattr__ is called for undefined attributes only, it can freely fetch other attributes within its own code. However, because __getattribute__ and __setattr__ are run for all attributes, their code needs to be careful when accessing other attributes to avoid calling themselves again and triggering a recursive loop.

For example, another attribute fetch run inside a __getattribute__ method's code will trigger __getattribute__ again, and the code will usually loop until memory is exhausted:

    def __getattribute__(self, name):
        x = self.other                              # LOOPS!

Technically, this method is even more loop-prone than this may imply: a self attribute reference run anywhere in a class that defines this method will trigger __getattribute__, and also has the potential to loop depending on the class's logic. This is normally desired behavior--intercepting every attribute fetch is this method's purpose, after all--but you should be aware that this method catches all attribute fetches wherever they are coded. When coded within __getattribute__ itself, this almost always causes a loop. To avoid this loop, route the fetch through a higher superclass instead to skip this level's version; because the object class is always a new-style superclass, it serves well in this role:

    def __getattribute__(self, name):
        x = object.__getattribute__(self, 'other')  # Force higher to avoid me

For __setattr__, the situation is similar, as summarized earlier: assigning any attribute inside this method triggers __setattr__ again and may create a similar loop:

    def __setattr__(self, name, value):
        self.other = value                          # Recurs (and might loop!)

Here too, self attribute assignments anywhere in a class defining this method trigger __setattr__ as well, though the potential for looping is much stronger when they show up in __setattr__ itself. To work around this problem, you can assign the attribute as a key in the instance's __dict__ namespace dictionary instead. This avoids direct attribute assignment:

    def __setattr__(self, name, value):
        self.__dict__['other'] = value              # Use attr dict to avoid me
Alternatively, __setattr__ can route its own attribute assignments to a higher superclass to avoid looping, just like __getattribute__ (and per the upcoming note, this scheme is sometimes preferred):

    def __setattr__(self, name, value):
        object.__setattr__(self, 'other', value)    # Force higher to avoid me

By contrast, though, we cannot use the __dict__ trick to avoid loops in __getattribute__:

    def __getattribute__(self, name):
        x = self.__dict__['other']                  # LOOPS!

Fetching the __dict__ attribute itself triggers __getattribute__ again, causing a recursive loop. Strange but true!

The __delattr__ method is less commonly used in practice, but when it is, it is called for every attribute deletion (just as __setattr__ is called for every attribute assignment). When using this method, you must take care to avoid loops when deleting attributes, by using the same techniques: namespace dictionary operations or superclass method calls.

As noted earlier in the book, attributes implemented with new-style class features such as slots and properties are not physically stored in the instance's __dict__ namespace dictionary (and slots may even preclude its existence entirely!). Because of this, code that wishes to support such attributes should code __setattr__ to assign with the object.__setattr__ scheme shown here, not by self.__dict__ indexing. Namespace __dict__ operations suffice for classes known to store data in instances, like this chapter's self-contained examples; general tools, though, should prefer object.
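As a quick illustration of the superclass-routing scheme just recommended, here is a minimal sketch that combines it with __delattr__ in one class; the Recorder name and its attributes are made up for this example only:

    class Recorder:
        def __init__(self):
            self.log = []                               # Caught by __setattr__ below

        def __setattr__(self, name, value):
            print('set: ' + name)
            object.__setattr__(self, name, value)       # Route up to avoid looping

        def __delattr__(self, name):
            print('del: ' + name)
            object.__delattr__(self, name)              # Same technique for deletions

    r = Recorder()                                      # Prints "set: log"
    r.data = 99                                         # Prints "set: data"
    del r.data                                          # Prints "del: data"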
First Example

Generic attribute management is not nearly as complicated as the prior section may have implied. To see how to put these ideas to work, here is the same first example we used for properties and descriptors in action again, this time implemented with attribute operator overloading methods. Because these methods are so generic, we test attribute names here to know when a managed attribute is being accessed; others are allowed to pass normally:

    class Person:                               # Portable: 2.X or 3.X
        def __init__(self, name):               # On [Person()]
            self._name = name                   # Triggers __setattr__!

        def __getattr__(self, attr):            # On [obj.undefined]
            print('get: ' + attr)
            if attr == 'name':                  # Intercept name: not stored
                return self._name               # Does not loop: real attr
            else:                               # Others are errors
                raise AttributeError(attr)

        def __setattr__(self, attr, value):     # On [obj.any = value]
            print('set: ' + attr)
            if attr == 'name':
                attr = '_name'                  # Set internal name
            self.__dict__[attr] = value         # Avoid looping here

        def __delattr__(self, attr):            # On [del obj.any]
            print('del: ' + attr)
            if attr == 'name':
                attr = '_name'                  # Avoid looping here too
            del self.__dict__[attr]             # but much less common

    bob = Person('Bob Smith')                   # bob has a managed attribute
    print(bob.name)                             # Runs __getattr__
    bob.name = 'Robert Smith'                   # Runs __setattr__
    print(bob.name)
    del bob.name                                # Runs __delattr__

    print('-'*20)
    sue = Person('Sue Jones')                   # sue inherits property too
    print(sue.name)
    #print(Person.name.__doc__)                 # No equivalent here

Notice that the attribute assignment in the __init__ constructor triggers __setattr__ too: this method catches every attribute assignment, even those anywhere within the class itself. When this code is run, the same output is produced, but this time it's the result of Python's normal operator overloading mechanism and our attribute interception methods:

    c:\code> py -3 getattr-person.py
    set: name
    get: name
    Bob Smith
    set: name
    get: name
    Robert Smith
    del: name
    --------------------
    set: name
    get: name
    Sue Jones

Also note that, unlike with properties and descriptors, there's no direct notion of specifying documentation for our attribute here; managed attributes exist within the code of our interception methods, not as distinct objects.
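A small check, assuming the Person class just shown is in scope, makes the last point concrete: the managed name exists only in the interception code, while the internal _name is what is actually stored on the instance.

    bob = Person('Bob Smith')                   # Prints "set: name"
    print('_name' in bob.__dict__)              # True: the internal name is stored
    print('name' in bob.__dict__)               # False: name exists only via __getattr__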
Using __getattribute__

To achieve exactly the same results with __getattribute__, replace __getattr__ in the example with the following. Because it catches all attribute fetches, this version must route the names it does not manage to a superclass, rather than assuming that unknown names are errors:

    # Replace __getattr__ with this:
    def __getattribute__(self, attr):           # On [obj.any]: intercept all names
        print('get: ' + attr)
        if attr == 'name':                      # Map to internal name
            attr = '_name'                      # Avoid looping here
        return object.__getattribute__(self, attr)

When run with this change, the output is similar, but we get an extra __getattribute__ call for the __dict__ fetch in __setattr__ (the first time originating in __init__):

    c:\code> py -3 getattribute-person.py
    set: name
    get: __dict__
    get: name
    Bob Smith
    set: name
    get: __dict__
    get: name
    Robert Smith
    del: name
    get: __dict__
    --------------------
    set: name
    get: __dict__
    get: name
    Sue Jones

This example is equivalent to that coded for properties and descriptors, but it's a bit artificial, and it doesn't really highlight these tools' assets. Because they are generic, __getattr__ and __getattribute__ are probably more commonly used in delegation-based code (as sketched earlier), where attribute access is validated and routed to an embedded object. Where just a single attribute must be managed, properties and descriptors might do as well or better.
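To hint at that delegation-based role, here is a small hedged sketch of a proxy that validates attribute names before routing them to an embedded object; the PrivateProxy name, the privates list, and the wrapped Person are all invented for illustration and are not the chapter's case study code:

    class PrivateProxy:
        def __init__(self, obj, privates):
            self.wrapped = obj                          # Real attrs: no loop risk
            self.privates = privates
        def __getattr__(self, name):                    # On undefined names only
            if name in self.privates:
                raise AttributeError('private attribute: ' + name)
            return getattr(self.wrapped, name)          # Delegate to embedded object

    class Person:
        def __init__(self, name, salary):
            self.name = name
            self.salary = salary

    p = PrivateProxy(Person('Bob', 99), privates=['salary'])
    print(p.name)                                       # Prints "Bob": delegated
    # p.salary                                          # Would raise AttributeError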
Computed Attributes

As before, our prior example doesn't really do anything but trace attribute fetches; it's not much more work to compute an attribute's value when it is fetched. As for properties and descriptors, the following creates a virtual attribute X that runs a calculation when fetched:

    class AttrSquare:
        def __init__(self, start):
            self.value = start                  # Triggers __setattr__!

        def __getattr__(self, attr):            # On undefined attr fetch
            if attr == 'X':
                return self.value ** 2          # value is not undefined
            else:
                raise AttributeError(attr)

        def __setattr__(self, attr, value):     # On all attr assignments
            if attr == 'X':
                attr = 'value'
            self.__dict__[attr] = value

    A = AttrSquare(3)                           # 2 instances of class with overloading
    B = AttrSquare(32)                          # Each has different state information

    print(A.X)                                  # 3 ** 2
    A.X = 4
    print(A.X)                                  # 4 ** 2
    print(B.X)                                  # 32 ** 2 (1024)

Running this code results in the same output that we got earlier when using properties and descriptors, but this script's mechanics are based on generic attribute interception methods:

    c:\code> py -3 getattr-computed.py
    9
    16
    1024

Using __getattribute__

As before, we can achieve the same effect with __getattribute__ instead of __getattr__; the following replaces the fetch method with __getattribute__ and changes the __setattr__ assignment method to avoid looping by using direct superclass method calls instead of __dict__ keys:

    class AttrSquare(object):                   # Add (object) for 2.X
        def __init__(self, start):
            self.value = start                  # Triggers __setattr__!

        def __getattribute__(self, attr):       # On all attr fetches
            if attr == 'X':
                return self.value ** 2          # Triggers __getattribute__ again!
            else:
                return object.__getattribute__(self, attr)

        def __setattr__(self, attr, value):     # On all attr assignments
            if attr == 'X':
                attr = 'value'
            object.__setattr__(self, attr, value)

When this version, getattribute-computed.py, is run, the results are the same again. Notice, though, the implicit routing going on inside this class's methods: self.value = start inside the constructor triggers __setattr__, and self.value inside __getattribute__ triggers __getattribute__ again. In fact, __getattribute__ is run twice each time we fetch attribute X. This doesn't happen in the __getattr__ version, because the value attribute is not undefined.
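A quick instrumented sketch, not one of the book's own files, makes the double call visible by adding a print call to __getattribute__; the AttrSquareTraced name is invented here:

    class AttrSquareTraced(object):             # (object) for 2.X only
        def __init__(self, start):
            self.value = start

        def __getattribute__(self, attr):
            print('getattribute: ' + attr)      # Trace every fetch, managed or not
            if attr == 'X':
                return self.value ** 2          # Second __getattribute__ call here
            else:
                return object.__getattribute__(self, attr)

        def __setattr__(self, attr, value):
            if attr == 'X':
                attr = 'value'
            object.__setattr__(self, attr, value)

    A = AttrSquareTraced(3)
    print(A.X)          # Prints "getattribute: X", then "getattribute: value", then 9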
If you care about the extra call, you can change __getattribute__ to use the superclass method to fetch value as well:

    def __getattribute__(self, attr):
        if attr == 'X':
            return object.__getattribute__(self, 'value') ** 2

Of course, this still incurs a call to the superclass method, but not an additional recursive call before we get there. Add print calls to these methods to trace how and when they run.

__getattr__ and __getattribute__ Compared

To summarize the coding differences between __getattr__ and __getattribute__, the following example uses both to implement three attributes: attr1 is a class attribute, attr2 is an instance attribute, and attr3 is a virtual managed attribute computed when fetched:

    class GetAttr:
        attr1 = 1
        def __init__(self):
            self.attr2 = 2
        def __getattr__(self, attr):            # On undefined attrs only
            print('get: ' + attr)               # Not on attr1: inherited from class
            if attr == 'attr3':                 # Not on attr2: stored on instance
                return 3
            else:
                raise AttributeError(attr)

    X = GetAttr()
    print(X.attr1)
    print(X.attr2)
    print(X.attr3)
    print('-'*20)

    class GetAttribute(object):                 # (object) needed in 2.X only
        attr1 = 1
        def __init__(self):
            self.attr2 = 2
        def __getattribute__(self, attr):       # On all attr fetches
            print('get: ' + attr)               # Use superclass to avoid looping here
            if attr == 'attr3':
                return 3
            else:
                return object.__getattribute__(self, attr)

    X = GetAttribute()
    print(X.attr1)
    print(X.attr2)
    print(X.attr3)
When this code is run, the __getattr__ version intercepts only the attr3 fetch, because attr1 and attr2 are defined. The __getattribute__ version, by contrast, catches every attribute fetch, and must route those it does not manage to the superclass fetcher to avoid loops:

    c:\code> py -3 getattr- -getattr.py
    1
    2
    get: attr3
    3
    --------------------
    get: attr1
    1
    get: attr2
    2
    get: attr3
    3

Although __getattribute__ can catch more attribute fetches than __getattr__, in practice they are often just variations on a theme: if attributes are not physically stored, the two have the same effect.

Management Techniques Compared

To summarize the coding differences in all four attribute management schemes we've seen in this chapter, let's quickly step through a somewhat more comprehensive computed-attribute example using each technique, coded to run in either Python 3.X or 2.X. The following first version uses properties to intercept and calculate attributes named square and cube. Notice how their base values are stored in names that begin with an underscore, so they don't clash with the names of the properties themselves:

    # Two dynamically computed attributes with properties

    class Powers(object):                       # Need (object) in 2.X only
        def __init__(self, square, cube):
            self._square = square               # _square is the base value
            self._cube = cube                   # square is the property name

        def getSquare(self):
            return self._square ** 2
        def setSquare(self, value):
            self._square = value
        square = property(getSquare, setSquare)

        def getCube(self):
            return self._cube ** 3
        cube = property(getCube)

    X = Powers(3, 4)
    print(X.square)                             # 3 ** 2 = 9
    print(X.cube)                               # 4 ** 3 = 64
    X.square = 5
    print(X.square)                             # 5 ** 2 = 25
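For comparison, here is a hedged sketch of how the same two attributes might be coded with descriptors, following the descriptor patterns shown earlier in this chapter; this is only an illustration and is not necessarily identical to the versions the comparison goes on to present.

    # The same two computed attributes, coded with descriptors instead.
    class DescSquare(object):
        def __get__(self, instance, owner=None):
            return instance._square ** 2
        def __set__(self, instance, value):
            instance._square = value

    class DescCube(object):
        def __get__(self, instance, owner=None):
            return instance._cube ** 3

    class Powers(object):                       # Need (object) in 2.X only
        square = DescSquare()                   # One descriptor instance per attribute
        cube = DescCube()
        def __init__(self, square, cube):
            self._square = square               # Normal instance storage
            self._cube = cube

    X = Powers(3, 4)
    print(X.square)                             # 3 ** 2 = 9
    print(X.cube)                               # 4 ** 3 = 64
    X.square = 5
    print(X.square)                             # 5 ** 2 = 25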