Python: teaching kids and biting bits don't mix

Today an important thing happened in my professional life. I was told to take a break from The Top Priority Project, so that I could deal with a more important project. The evaluation of the expression max_priority+1 caused my wetware registers to overflow. Therefore, you should consider the following piece of whining as an attempt by a brain to recover from deadline damage. That is, you won't get the kind of deep discussions and intellectual insights you've come to expect from yosefk.com. Instead, you'll get shallow, plain, simple and happy Python hate. That's the kind of thing we have in stock right now.

It has been known for quite some time that a species called the Idiot is out there, and its revolting specimens will mercilessly prey on your belief in the future of the human race. The time has come for me to share with you some of the shocking experience from my encounters with idiots. Believe it or not, there exists a kind of Idiot who sincerely thinks that a Person should not complain about Technology or Tools he or she gets to use, and that such complaints are deeply wrong on one level or another, depending on the exact idiot in question. Well, I hereby inform the idiots, at least the ones who can use a browser and read text, that I (1) have a bad attitude, (2) am not a good workman, so (3) I'll complain about any technology I use as I damn please and (4) I don't give a flying fuck about your moronic opinion, so don't bother to share it with me.

So, Python hate. I don't really hate Python; it's a figure of speech. You know what I really hate? Of course you do. I hate C++. C++ is the hate of my life. I don't think I'll ever be able to hate another programming language. For me, to hate means to recognize that something is inherently evil and should be exterminated. Reaching that status is an outstanding achievement for any technology. Your typical piece of software never gets that far. All it does is something you need most of the time, and then something extremely brain-damaged some of the time. So you kinda like it, but sometimes it just pisses you off. It's gotta piss you off. It's a machine. It doesn't have a pigeon crapload worth of common sense. Of course it pisses you off. Don't lie to me! A person who was exposed to machines and doesn't hate them is either an idiot or is completely devoid of soul! Step back, child of Satan!

You know what really pisses me off about Python? It's the combination of being BDFL-managed and having roots in the noble idea of being The Language for Teaching. Sure, having lexical scoping wouldn't hurt, and having name errors 5 minutes down the road in a program that happily parses already hurts, and Python shells aren't a picnic, blah blah blah. But nothing is perfect, you know, and people adapt to anything. I learned to live in tcsh and vim as my IDE. So I know how adaptability can bring you quite far, sometimes much farther than you wish to get. But this BDFL+teaching combo really bugs the hell out of me. Allow me to elaborate.

Ever heard about programming languages with an ANSI standard? Sure you did. Well, the other kind of languages have a BDFL standard. It's when a Benevolent Dictator For Life, typically the Punk Compiler Writer (PCW) who invented the language and threw together the initial (and frequently still the only practical) implementation, decides what happens with the language. I plan to blog about PCWs at some point, but this isn't about PCWs in general, this is about PCWs who've been elevated to the BDFL rank. So they bend the language as they damn please. Sometimes it splits the user community (as in Perl 5 vs Perl 6) and sometimes it doesn't (as in Python 2 and Python 3, or so it seems). I'd say that it's totally stupid to use a BDFL-governed language, but I can't, because that would offend C++, The Hate Of My Life, who does have an ANSI standard. Relax, darling. It's you, and only you, that I truly hate. The others mean nothing to me.

So that's what the "BDFL" part means. The "teaching" part is about Python being supposed to be a good (the best? of course, the best) language for teaching kids how to program. While I'm not sure whether the BDFL part is inherently wrong, the teaching part is sure as hell wrong, in two ways:

  1. Why the fuck should kids program?
  2. Why the fuck should I program in a language for kids?

I didn't program till the age of 17, and I have absolutely no regrets about it. I know quite a few other programmers-who're-really-into-software-someone-call-for-help who didn't hack in their diapers, either. I also know a bunch of people who did program since they were 10 to 12 years old. They are all burnt out. They're trying to look for some other occupation. Theater, psychology, physics, philosophy, you name it. I haven't gathered that many data points, just a couple dozen. Some people won't fit the pattern I see. But it's enough for me to assume that kids could have better things to do than programming. Watching cartoons is one thing that sounds like fun. That I always liked.

I'm not sure kids shouldn't program. We need scientific data on that one. I'm no specialist, but locking a large enough set of kids in the basement and having them implement progressive radiosity sounds like a good start. Anyway, as you probably noticed, while I'm curious about the scientific truth, I don't care much about kids. I care about me.

Me. I'm a professional programmer. By which I mean to say, I shovel through piles of virtual shit for a living. And to shovel quickly and happily, I need a big shovel. Python is one of my shovels. Core dumps are one of my piles of shit. Python, meet the core dumps. Core dumps, meet the Python.

Core dumps are spelled in hexadecimal. I mean "core dumps", "spelled" and "hexadecimal" in the broadest sense, as in "things having to do with machine encoding and that sort of low-level crud are viewed and edited in tools that tend to emit and consume integers in hexadecimal notation". Or "core dumps are spelled in hexadecimal" for short.

To deal with hexadecimal, Python has a hex() function and a '%x' format string like it should. Well, almost. Python thinks that hexadecimal is like decimal, except in base 16 instead of 10. Integers are naturally spelled in sign magnitude format: 1, 10, -1, -10. In hexadecimal, that's 0x1, 0xa, -0x1, -0xa. "-0x1"? "-0x1"?!! FUCK YOU, PYTHON!!

You see, all computers that have been out there since the year 19-something store integers in 2's complement format. There used to be 1's complement and sign magnitude machines, but they are all dead, which is a good thing, because 2's complement is a good thing. In 2's complement, -1 in hexadecimal is 0xffffffff. If you use 32-bit integers. And twice the number of f's on 64-bit machines. Do I give a flying fuck about 64-bit machines? I think you know the answer. If I cared, I'd run Python on a 64-bit host. An old version of Python.
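To make the gripe concrete, here's a minimal sketch (assuming a 32-bit word, which is exactly the kind of assumption Python refuses to make for you):

```python
# hex() insists on sign magnitude; to get the 2's complement
# spelling you have to mask by hand (32-bit width assumed here).
print(hex(-1))                # '-0x1'       -- the mathematician's answer
print(hex(-1 & 0xffffffff))   # '0xffffffff' -- the bit biter's answer
print(hex(-10 & 0xffffffff))  # '0xfffffff6'
```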

In old versions of Python, hex() worked about right. Python has ints and longs (that's fixnum and bignum in Lisp, and it would be int and BigInteger in Java if ints didn't overflow but would instead turn into BigIntegers when int wouldn't have enough bits). hex() assumed that if you're using ints, you're a bit biter, and you want to get 0xffffffff for -1, and you should get your non-portable, but handy result. If, on the other hand, you used longs, you probably were a mathematician, so you'd get -0x1. And this was juuuust riiiight.

However, starting from version 2.something, hex() was "fixed" to always do "The Right Thing" – return the portable sign magnitude string. That, my friends, is what a kid would expect. You should know – you were a kid yourself, didn't you expect just that?! Naturally, breaking the frigging backwards compatibility is perfectly OK with the PCW BDFL, who sure as hell doesn't want to add another function, sex() for "sexadecimal". That function could (preferably) do the new thing, and hex would do the old thing to not break the existing programs. Or it could (passably) do the old thing so that people could at least get the old functionality easily, even though hex() would no longer work. But noooo, we need just one hex function doing the one right thing. Too bad we only found out what that right thing was after all those years went by, but in retrospect, it's obvious that nobody needs what we've been doing.

Now, could anybody explain to me the value of hexadecimal notation outside of the exciting world of low-level bit fucking? Why on Earth would you want to convert your numbers to a different base? One particular case of this base brain rot is teaching kids to convert between bases. As Richard Feynman has pointed out, and he had the authority to say so and I don't and hence the reference, converting between bases is completely pointless, and teaching kids how to do this is a waste of time. So what you do is you give me, the adult programmer with my adult core dump problems, a toy that a kid wouldn't need or want to play with. Thank you! You've found the perfect time, because I have a DEADLINE and this fucking shit CRASHES and I couldn't possibly have BETTER THINGS TO DO than WASTING TIME ON CONVERTING BASES!!

I know that I can implement the right (yes, THE RIGHT, damn it) hex and put it into my .pythonrc and that would solve the interactive shell annoyances and I could carry it around and import it and that would solve the non-interactive annoyances, thanks again. Until now I've done it in a cheaper way though – I had a Python 2.3.5 binary handy, and that did the trick. 2 Pythons, one with new libraries and shiny metaprogramming crap and stuff, and one with a working hex(). I like snakes.
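For completeness, a sketch of the kind of thing I mean by "the right hex", droppable into .pythonrc (the name rhex and the 32-bit default width are my choices, not anything Python ships):

```python
# "The right hex": 2's complement output for negative numbers,
# with the word width as a parameter (default 32 bits).
def rhex(n, bits=32):
    return hex(n & ((1 << bits) - 1))

print(rhex(-1))      # '0xffffffff'
print(rhex(-1, 64))  # '0xffffffffffffffff'
print(rhex(0xbeef))  # '0xbeef' -- non-negative numbers come out unchanged
```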

Why do I attribute this business to kid-teaching, of all things? I don't lurk on Python forums, so I'm basing this on rumors I heard from die-hard Python weenies. Another, TOTALLY INFURIATING thing: a/b and a//b. int/int yields a frigging FLOAT in Python 3!! This has all the features of hex(-1) == '-0x1':

  • "Kids" get the wrong answer: 3/2 should give 1 (honest-to-God integer division) or it should give the ratio 3/2 (honest-to-God rational/real number division). Floating point is reeeaaally too complicated. I've seen a teacher saying online something along the lines of "if you ask how much is 3/2 times 2, how do you grade the answer 2.99999"? Bonus points if your reply managed to take the identity "2.999… = 3" into account (hordes of grown-ups fail to get that one).
  • "Programmers" get the wrong answer: chunks=len(arr)/n; rem=len(arr)%n. 'nuff said.
  • "Mathematicians" get the wrong answer: I mean, if you're doing numeric work in Python (what's the matter, can't pay for a Matlab license?), you probably know about integer division versus floating point division, but you've never heard about an operation called // (no, it's not a BCPL comment). 3/2=1.5? How am I supposed to do integer division? int(3/2)? Argh!

You know, I'm actually more willing to upgrade to C++0x, whenever that materializes, than I am willing to upgrade to Python 3. At least it would be "for(auto p=m.begin()…" instead of "for(std::map<std::string,std::string>::const_iterator p=FUCK OFF AND DIE YOU EVIL PROGRAMMING LANGUAGE FROM HELL!!".

What's that? Who said I liked C++0x more than I liked Python 3?! Upgrade, I said, "willing to upgrade"! Don't you see it's relative to the previous version? In absolute terms, who can beat C++?! Come back, C++! I still hate you!!

26 comments

#1 hiffy on 04.09.08 at 6:02 pm

That, sir, was inspired.

#2 taw on 04.23.08 at 5:48 am

I wanted to flame you and tell you that you should have used "%08x" % -1, which would obviously do the right thing, but funnily it doesn't, printing lame "-0000001" instead.

Which kinda made me agree with your point, as sprintf("%08x", -1) definitely works in Ruby, doing exactly what it looks like it's doing, and is also portable.

#3 Yossi Kreinin on 04.23.08 at 11:25 am

Yep, in Ruby 1.8.6, '%x'%-1 returns '..f', which is probably the best thing you can do – 2's complement, and doesn't depend on any particular data type size.
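For comparison, the Python side of this (I'll take your word on the Ruby behavior):

```python
# Python's %x is sign magnitude too, so zero padding pads
# a negative number instead of wrapping it:
print('%x' % -1)                 # '-1'
print('%08x' % -1)               # '-0000001' -- taw's lame result
print('%x' % (-1 & 0xffffffff))  # 'ffffffff' -- the manual masking workaround
```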

The thing that gets me in Python is that it worked passably. That they broke it is annoying, that they did it for the completely wrong reasons doesn't make it any better.

#4 Asaf Bartov on 04.30.08 at 2:31 am

hiffy is right. An inspired rant!

And as long as we're on the subject of teaching kids to program (though I fully agree with you about it not being necessary, as about teaching 'em to convert between bases), the (in)famous Rubyist _why is working on a cute programming environment for beginners called Hackety Hack, which is worth looking at, I think.

Also, thanks for the C++ FQA! It's great to finally have a good site to link to when I want to save my breath explaining the evils of C++ to some benighted colleagues.

#5 Yossi Kreinin on 04.30.08 at 10:36 am

Regarding the "inspired" bit – um, thanks, it was primarily inspired by overtime…

As to C++ FQA – I hope it's any good as persuasion ammunition. It turns out I can't test it myself very much, because when someone asks you, "what's wrong with constructors?", and you say, "read this page I wrote about it", most of their attention gets occupied by the thought: "hey, this guy really IS a nerd". Oh well. Maybe it works better when I'm an anonymous third party in the argument.

#6 alexandre on 06.07.08 at 11:11 am

Interesting rant, and given that you picked up only on minor warts, I am tempted to say that you like Python. :-)

However, I think you are overemphasizing a bit the "teaching language" roots of Python. It is true that ABC (Python's direct ancestor) was originally for teaching, but this link between the two languages was cut long ago. I have never yet seen the addition of a language feature refused (or accepted) on the basis that it would make the language harder (or easier) for the kids to learn.

And about the "true division" thingy, I can't tell whether this was a good thing or not. The only thing I can say is that changing the division operator was controversial and not an easy decision to take.

#7 Yossi Kreinin on 06.07.08 at 11:39 am

I do like Python, in the sense that I think it does much less harm than good. Basically the quality scale for programming languages is:

1. So bad that you must rewrite your legacy code in something else ASAP (COBOL? Assembly? Maybe not. Maybe there are no languages which are that bad.)
2. Too bad for new code, not bad enough to justify rewriting existing code (C++)
3. Good enough to write new code in, and good enough to be chosen as the language for new projects for availability reasons (for example, if everybody around speaks Python, it's a good enough reason to use it and not Ruby, even though the latter is apparently somewhat better linguistically).
4. So wonderful/easy to learn that you want to use it for new code even though it isn't widely spoken in your environment and has other availability problems (Lisp? D?)

Most popular languages happen to fall between 2 and 3, and Python gets a solid 3 in my book, and 3.5 in a 3GL environment when you want to advocate a 4GL. So yeah, I like Python.

#8 alexandre on 06.08.08 at 10:10 pm

Personally, I haven't done enough programming in C++ to judge it (although I do find your rants and the FQA about it very entertaining). One thing I find funny, though, is that almost every year it's a C++ team that wins the ICFP programming contest. For example, last year Team Smartass (a team of Googlers) won the first prize and made the organizers declare:

C++ is the programming language of choice for
discriminating hackers.

And, the second prize was won by another team of C++ coders (United Coding Team), but they made the organizers declare:

Perl is a fine tool for many applications.

That is pretty ironic given that the contest's main goal is to raise awareness of functional programming languages. The write-up of the contest is quite interesting, too. Hopefully, this year's contest will be more challenging than "code a VM".

I don't know why people always compare Ruby to Python, and vice versa. Ruby's design philosophy is much closer to Perl's than to Python's. Rubyists, just like Perl hackers, aim for a syntactically powerful language at the cost of ambiguity. Pythonistas, on the other hand, aim for readability at the cost of some rigidness in the grammar (e.g. Python will probably never move beyond the current extended LL(1) parser, whereas Ruby needs a quite complex LALR(1) parser). That's where your analogy with a kids' language is perhaps somewhat true.

I agree that Lisp is a nice and powerful language. However, I think the community has severe problems (especially when it comes to marketing the language), which makes the language a bad choice for larger projects. However, for small projects done with friends who know the language well, Lisp is certainly a great choice (a fun one, too).

Anyway, I am really intrigued by D. I have heard many clever people advocating it as an alternative to C++, but I had never tried it. So, I took a few hours today, installed both DMD and GDC and read some of the documentation posted on the D website.

So far, I like it! It is an interesting blend between simple C and high-level scripting languages. Although the libraries seem a bit immature (yet quite complete for a young language). I will probably try to give it a better "test-drive" next weekend (I am thinking of implementing a small interpreter with it).

P.S. I noticed that you're using an old version of WordPress. In my humble opinion, that is a bad idea. About a month ago, some attacker successfully exploited a PHP injection vulnerability in WordPress 2.3 and gained shell access to my server account. Thankfully, the attacker didn't do much damage: he simply appended some spam junk to all my HTML pages, which I cleaned easily with one sed command.

#9 Yossi Kreinin on 06.09.08 at 4:00 am

"Team Smartass": damn, I really envy those people. Winning a competition as a member of Team Smartass. Is that awesome or what?

I don't follow these competitions, because of not liking programming competitions and because of not caring that much about FP. The reason that mainstream non-FP languages frequently win those competitions is IMO that the users of those languages have more practice. A C++ loop iterating over a map is ugly, but I've typed it a gazillion times and I can do it again really fast, because I use it all the time. A professional prehistorical warrior with a club has a good chance to take out an amateur shotgun enthusiast. Miss a few times and you'll be clubbed to death.

I don't think Ruby is that close to Perl, except for the occasional dollar sign. I think Ruby and Python are the 2 top 4GLs if your metric is popularity * cleanliness, and they use very similar dynamic hashtable-based REPL models, so it's natural to compare them. I think Perl 5 is way more complex syntactically than either of them. Regarding parsing – I don't care whether it's LL or LR, as long as it's context free (C++ has significantly lowered my standards for "ease of parsing"). I don't think Ruby tries to do any "DWIM" parsing similarly to Perl – or does it? Which ambiguities are you talking about?

D rules.

Finally surrendered to my growing paranoia and upgraded to WP 2.5.1, surprisingly painlessly.

#10 AlSweigart on 07.01.08 at 10:26 am

1. Why the fuck should kids program?

Because programming can be a fun, creative task that at the same time teaches them disciplined logical reasoning skills. And some kids will like making their own Tetris clone.

2. Why the fuck should I program in a language for kids?

Python is great because it has a gentle learning curve (making it a great first language) while at the same time scales up to be a professional language in most software domains.

I really advocate Python, as a language for kids, non-programmers, and programmers. It has the things I like about Perl without the things I don't like about Perl (mostly). In fact, I wrote a book (Invent Your Own Computer Games with Python) specifically to teach kids programming in Python (can we agree that it's the best real-world language for that task at least?)

I'll take on the two flaws you bring up about Python. I think the hex() function should behave the way it does. It simply has the job of converting integers to hexadecimal. Two's complement is an idea that is beyond a mere base conversion. It would bring up all sorts of overflow problems and architecture-specific concerns (32-bit two's complement or 64-bit?)

I understand your gripe, but it seems that you are upset that a design trade-off was made not in your favor. I don't think that's a fatal flaw for a language.

The division thing has always been a clusterfuck. I forget which video, but I recall Guido van Rossum saying he regretted the whole integer division thing, which is probably why int/int = float in Python 3000.

But I still don't think it is too complicated. Explain it as such: if you divide 13 by 5, you get 2 with a remainder of 3. Integer division gives you the 2 part, mod gives you the 3 part. To get real division, convert an operand to a float. Sure, that's dumb (hence the fix in Python 3), but again, I don't see that as a fatal flaw in the language.

#11 Yossi Kreinin on 07.01.08 at 11:53 am

Let's put it this way: I sincerely like Python, I think kids should program if they wanna, and the whole thing about hex and division isn't a big deal.

I still say that base conversion is a useless operation while 2's complement hex is useful. And if base conversion is considered interesting, add a base() function for it, but why break hex(), which did the useful thing for such a long time?

Let's agree that Python does have an annoying vibe of teaching you the right thing. Consider the interactive behavior of exit or quit. The damn things aren't strings anymore; they are objects with overridden __repr__ telling you that you should type Ctrl-D!! WTF? Why not sys.exit(0) instead of fucking my brain?
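A stripped-down sketch of the trick the interactive shell pulls (the class name here is mine; the real machinery lives in the site module):

```python
# exit/quit in the interactive shell aren't strings anymore:
# they're objects whose __repr__ nags you instead of showing a value.
class Quitter:
    def __repr__(self):
        return 'Use exit() or Ctrl-D (i.e. EOF) to exit'
    def __call__(self, code=None):
        # actually calling it still quits
        raise SystemExit(code)

exit_ = Quitter()
print(repr(exit_))  # prints the nagging message
```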

On the other hand, it doesn't matter. For example, I like the attitude behind Perl more than the attitude behind Python (of course I refer to those attitudes as I perceive them). But I prefer to use Python rather than Perl. Creators and creations aren't trivially "similar".

#12 divmod on 10.31.08 at 8:07 pm

> chunks=len(arr)/n; rem=len(arr)%n

Ugh. Why not
chunks, rem = divmod(len(arr), n)
?
I think that still does int division in python 3.

#13 sauceror on 01.07.09 at 6:02 am

Thing is, we have a language for kids. It's called BASIC. It's what I learned on when I was 8, got tired of when I was 12 and moved on to scarier languages derived from C. Now I'm moving on to LISP-derived languages.
To paraphrase:
knowledge of the intricacies of C++ (and similar) should be treated like a martial art. First try to run away. Failing that, call the authorities. Your last resort should be to stand your ground and code.

#14 Harald Striepe on 05.27.09 at 5:51 pm

I think the old Dartmouth BASIC number is a really good starting point, as is learning assembler on a simple, old architecture like 6502. It's all about totally understanding all there is to know about an environment, then using those tools to analyze a problem, and experimenting with finding the solution that is verifiable, unlike your English or social science paper.
If they like it, most kids will move on to other languages, but they will have a basic understanding that is hard to gain from more sophisticated languages with large libraries. Sort of like doing math problems in your head and long hand versus a calculator.

#15 Yossi Kreinin on 05.27.09 at 11:44 pm

I think assembly is a bad way to model computation, whether you're a beginner or not, both because of its verbosity (as in function calls/definitions or intermediate results of arithmetic expressions) and its fixed-size limits (as in the number and to a lesser extent size of registers.) IMO many a beginner would take away a feeling that programs are verbose legalese. A group of beginners would love assembler as their first language because they're into making watches out of a thousand tiny moving parts; but those people will find their way to assembly or similar anyway.

#16 kedra marbun on 04.13.10 at 1:18 am

i'm just starting to learn py, so i can still clearly remember that my brain rejected -0x1. the int div scared the hell out of me, & half the hell when i knew about //

to the writer of "Invent your own computer games with python": i don't give a fuck about whether they're complicated or not, what i complain about is that they made my brain create new connections between my brain cells, instead of reinforcing existing ones. so as neuroscience goes, using them weakens the network that lets me know that -1 is 0xf…f in hex

'Why the fuck should I program in a language for kids?'. this shakes my pride & stomach

#17 hauptmech on 06.18.10 at 4:50 am

Loved the rant.

#18 Yossi Kreinin on 06.18.10 at 6:51 am

Are you this hauptmech? Interesting stuff.

#19 dumpster on 06.19.10 at 7:24 pm

I want to like python. I had hoped to use it for all my new cgi scripts but it took me forever to get used to the indentation.
I was and still am used to curly braces. I still prefer plain old C to any of the new so called "easy" languages.

#20 Yossi Kreinin on 06.20.10 at 12:28 am

Well, if you prefer plain old C for CGI scripts (or should I say programs), this leaves me speechless (I've seen people who did everything including "scripting" in C++, also people who did everything in C+bash+grep+awk+…; everything in C I hadn't seen).

#21 Craig the Teacher on 10.30.11 at 6:42 am

I am currently teaching a middle school child to program in Palo Alto, CA. The standard progression of languages in our area for children is MIT Scratch, CMU Alice, then Java or Python. Many children are learning Java for historical reasons. For the children learning Python, many take advantage of Pygame, which provides 2D graphics based on SDL.

My general view is that technology and art are merging. I don't think we are teaching children programming for them to be professional programmers. I believe we are teaching them programming so that they are better prepared to express their creativity in a world that is becoming increasingly influenced by technology.

Let's say that my students grow up to be environmental scientists or pastry chefs. I believe that an understanding of programming will help them in both of these jobs. Waiting until age 17 is too late. By this age, the children are too busy preparing for college and may not have time to experiment with learning things on their own. I want the children to be quite proficient in programming by this age so that they can then go back to studies such as literature or art.

For now, I teach the creation of simple arcade-style games, trying to unlock the creative potential of children with sound, music, 2D graphics, lists, for loops, and coordinate arithmetic.

In the same way that we teach math without any expectation that the child will become a professional math professor, we should not teach programming to children with the intent of training professional computer programmers.

Our goal here is to simply provide children the opportunity to make their own choices. Without the tools of knowledge, opportunities may close on them.

#22 Yossi Kreinin on 10.30.11 at 8:17 am

@Craig: it's definitely an inspiring approach. A bit sadly though, as a professional programmer who occasionally does some painting and sculpting, I was never able to interestingly "merge" programming and art; nor was programming useful for anything other than earning a salary, for that matter.

Personally, I explain this partly by my own interest in meta-programming/platforms/etc. rather than end-user programming, which obviously isn't applicable outside of a development context – and partly by programs being a bit like movies: theoretically, you're just making a bit string with readily available tools, but in practice, it's not at all easy to produce a notable result on a small budget (both in terms of time and money).

So if anyone successfully applies programming in the context of any sort of art, I think it's rare and notable.

P.S. As to my rant – it was, well, a rant… So this is my answer to you – but your comment wasn't "an answer to me" in the sense that your comment was serious, this comment of mine is serious, too, but whatever claims I made in the original rant probably aren't…

#23 Martin on 01.11.12 at 1:15 am

Started out inspired but it rants on. I could not agree more however. The best language ever, and I really mean "Ever" was Pascal.

The compiler was unforgiving and did a magnificent job of protecting the end consumers from (well you know…) IDIOTS. The final code was elegant, efficient, fast and above all: Reliable. (Idiot programmers need not apply)

There was absolutely nothing that Pascal could not do that "C" could, but "C" was far more egalitarian and IDIOT tolerant, ergo its popularity. C was also much less readable.

Then came the stupidest idea conceived since G.W.B., called Object Oriented Programming, giving rise to C++ and Delphi Object Pascal. Unfortunately, because OOP is necessary under GUI OSes such as Windows, C++ became king, and Delphi still has a following amongst Pascal lovers such as myself.

The touted benefits of OOP = Code reuse, encapsulation and inheritance are greatly over-rated.

1) Code re-use is available in all languages … it's called cut-paste-and-tweak.

The problem with OOP is that in order to make computer code re-use friendly, you have to bloat its complexity. I just love searching through a 1047-layer class hierarchy in pursuit of the proverbial "beef".

2) Object inheritance: Ditto.

3) Encapsulation = PASCAL units and ADTs do this exceptionally well, with far superior rules of scope compared to C/C++.

Unlike objects, which are dynamic, Pascal records and ADT procedures/functions can be statically implemented, i.e. faster, more efficient and less likely to crash and burn at run time due to buffer overflows, memory corruption, leaks and fragmentation.

4) Polymorphism = no problem with PASCAL procedure/function overloading.

5) Sophisticated and custom data types = No problem in pascal with strong type checking, unless you specifically and deliberately re-cast it.

6) Recursion = no Problem

7) Dynamic structures = no problem, without the dangerous pointer arithmetic often abused by C programmers.

8) Much more platform independent.

etc….

Like I said, there is absolutely nothing that Object Pascal/Delphi cannot do, that C++ can, but the Pascal code is much more likely to be stable.

Then came JAVA, which I call C++++++ for idiots.

And finally PYTHON ??!!!

Named after the proverbially stupid and maladjusted SULTAN of RIDICULOUS … Monty???!
—-
Python is a BASIC-like Object Oriented kids' toy on steroids:

0) 100% OOP
1) Dynamically typed (only)
2) Indenting instead of {} or begin/end statements?! That may save a few chars, but it can really byte you in the ass later.
3) No type checking
4) Interpreted, no compiler, ergo no compile-time checking.
5) No variable or type declarations !!!
6) Inefficient, slow, buggy

Just because the code does NOT crash as often as C++ (because of internal memory management) does not mean that the program is actually doing what it is supposed to do in the first place.

Python encourages the kind of mindless, careless, goal-oriented programming typical of teenagers without methodology or formal education. It's great for writing phone apps, I suppose.

Now, to my horror, they are teaching this crap to my kid in first year engineering.

The ratio of IDIOTS to HUMANS is fixed, but the damage done by IDIOTS is exponential. Since the population is increasing, we are doomed.

#24 N. Jost on 07.06.12 at 8:41 pm

I agree that the current functionality of the bin() and hex() functions is quite strange and useless. But this is only a really tiny, minor aspect of the language, so do not judge the entire thing based only on this. There are third-party libraries to deal with low-level data, so bitching about that not being in the Python core is just a waste of time.

About the integer division, dude, that is another minor change: now you must use // to emphasize floor division. No need for casting! And I think it's great that floor division is now a real operation, more consistent than having to do tricks like 5/2. to get the "correct" 2.5 float answer.

I believe that programming is a really useful skill to have, as much as or even more than math. And to make more brain-functional human beings, it is not bad at all to learn it early in life. But Python certainly isn't just about that. It was, of course, designed to be clean and easy to learn, but it evolved into something much bigger than that.

You people should realize that judging the whole language based only on one or two minor aspects is like defaming a person based on his big nose.

#25 Yossi Kreinin on 07.07.12 at 10:03 pm

I like Python; it was just a rant.

That said, Python 3 is IMO a train wreck, kind of, precisely because of wishing to "do things right" where it isn't even that clear that the new way is right.

#26 Danny on 12.07.12 at 11:35 pm

Martin:

>The best language ever, and I really mean "Ever" was Pascal.

For 1990, it was great, yes (Delphi, too) :-)

>C was also much less readable.

Agreed. When I first saw C I thought it was some kind of joke or historical artifact. Turns out people still use it, and it's not that bad once you develop some habits to paper over the (many) problems of the language.

>Unlike objects which are dynamic

Objects in C++ or Delphi are not dynamic. Objects in Smalltalk, LISP, Python, Ruby, … are.

The original idea of objects was (ask Alan Kay) that an object is something you can pass a message (a string) to and it does what you told it to. That's all.

Dynamic languages actually do it that way: you can write a procedure that gets the message passed as a string argument (this procedure "is" the object). When you write obj.foo it calls getattr(obj, "foo") automatically; when you write obj.bar it calls getattr(obj, "bar").
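A minimal Python sketch of this message-passing view (the class and message names here are my own invention, not anything standard):

```python
class MessageObject:
    # __getattr__ is only called when normal attribute lookup fails,
    # so every unknown attribute access arrives here as a string --
    # the "message" in Alan Kay's sense.
    def __getattr__(self, message):
        return "received message %r" % message

obj = MessageObject()
print(obj.foo)  # the "object" is really one procedure dispatching on a string
```

So obj.foo and obj.bar both land in the same procedure, which decides what to do based on the string it got.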

C++ (somewhat on purpose) didn't do that: they wanted raw performance, so there it's an integer offset, not a string. So an object in C++ or Delphi looks like a struct with "private variables in the interface/header file" (not very private, is it?), which is the one thing it wasn't meant to be (it's supposed to be a procedure, not a struct).

About Python:

>0) 100% OOP

Yeah, and the weird struct kind, mostly. At least the REPL doesn't print the member variables when you print the object, that would have shown total ignorance.

In defense of Python, in Python 3 you can actually write dynamic objects (i.e. procedures), too.

>1) Dynamically typed (only)
> 2) Indenting instead of {} or begin/end statements ?! That may save a few chars, but it can really byte you in the ass later.

In the countless years I've used it, indentation hasn't bitten me in the ass even once. This is a complaint from people who didn't try it. Seriously, you are indenting anyway, in any language. Might as well skip the braces.

(alternative: skip the indentation and keep the braces, i.e. write the entire program in one line – not good)

(redundancy is not an alternative)

> 3) No type checking

I don't know where that myth comes from. If you were in the Python REPL for 1 minute, you'd see it does type check. Python is strongly typed.

>>> 1 + "hello"

TypeError: unsupported operand type(s) for +: 'int' and 'str'

> 4) Interpreted, no compiler ergo no compiler time checking.

There are compilers (Shedskin etc) and they do check at compile time (try it, it has very annoying error messages, almost as annoying as Pascal compilers').

> 5) No variable or type declarations !!!

Thank god. What are they for?

This is really an assembler mindset. I don't care how it stores the values. Don't make me write int v = 5 like an idiot. What is 5? An integer. So what is v? Yep. No need for boilerplate stating the obvious.

That said, there are and always have been two schools of thinking and dynamic language users are just different in that they can't abide boilerplate for any reason.

If you want to write a formal proof, though, you need the boilerplate – otherwise because of Gödel's Incompleteness Theorem it's impossible to prove what your program does without running it.

(I don't know a single programmer who writes a formal proof for all his programs, though)

yosefk:

>"Kids" get the wrong answer: 3/2 should give 1 (honest-to-God integer division)

That would be wrong (as in: different from any class in school I ever had in my life).

>or it should give the ratio 3/2 (honest-to-God rational/real number division).

I agree, that would be sensible.

>Floating point is reeeaaally too complicated.

It's fine as a compromise.

>"Programmers" get the wrong answer: chunks=len(arr)/n; rem=len(arr)%n. 'nuff said.

chunks, rem = divmod(len(arr), n)

This is not C. You are allowed to have "multiple return values" (*cough*).

Otherwise chunks=len(arr)//n; rem=len(arr)%n if you want to be verbose (ugh).
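For the record, a runnable sketch of the divmod version (arr and n are just the placeholder names from the quote):

```python
arr = list(range(10))
n = 3

# one call returns quotient and remainder as a (chunks, rem) tuple
chunks, rem = divmod(len(arr), n)
print(chunks, rem)  # 3 1
```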

>"Mathematicians" get the wrong answer: I mean, if you're doing numeric work in Python (what's the matter, can't pay for a Matlab license?),

I don't know your background but we physicists and mathematicians use Python all the time (it's in the damn curriculum of the university!). Half the push comes from us to make a computer actually compute, you know, mathematical formulae without making mistakes like 1/2 == 0 (give me a break…).

Projects like sympy (for symbolic computation) had to have weird prologue initialisation which unbreaks things like this.

So it's (partly) our fault that you were angry. Hate us :-)

Though if you are programming a computer and not doing it for maths, what exactly are you using it for?

>you probably know about integer division versus floating point division

Know, yes. If any language ever bothers me with silent truncation, it gets a black mark.

>>> input()/input()
1
2
0
It better be kidding me…

>How am I supposed to do integer division? int(3/2)?

Yes. If you want to destroy information, state that you want to destroy information. This is not PHP.

How you are supposed to do integer division: 3//2

Also, Python has one of the nicest upgrade paths I've ever seen in any software product:

For any new feature, they wait for the next major version (it's a new major version, all bets are off) in order to make it standard.

But they introduce it in the __future__ module some years (!) before it becomes standard.

As for hex(), how did it know how many bits it should use for your two's complement in the first place? Did it guess?

I didn't even know that Python uses two's complement for storing integers (does it?).
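Either way, Python integers are conceptually unbounded, so hex() can't pick a width for you; it just prints the sign. If you want a two's complement bit pattern, you choose the width yourself by masking (my own sketch, using plain stock hex()):

```python
print(hex(-5))           # '-0x5': sign and magnitude, not a bit pattern
print(hex(-5 & 0xFF))    # '0xfb': 8-bit two's complement
print(hex(-5 & 0xFFFF))  # '0xfffb': 16-bit two's complement
```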
