Programming is the art of algorithm design and the craft of debugging errant code.
Ellen Ullman
Genetics is where we come from. It's deeply natural to want to know.
Introduced in the 1960s, multitasking is an engineering strategy for making computers more efficient. Human beings are the slowest elements in a system.
There is always one more bug to fix.
Programmers seem to be changing the world. It would be a relief, for them and for all of us, if they knew something about it.
If you've ever watched someone who is a mother talk on the phone, feed the dog, bounce the baby, it's just astounding to see someone manage, more or less well, to do all those things. But on a computer, multitasking is really binary. The task is either in the foreground, or it's not.
Productivity has always been the justification for the prepackaging of programming knowledge. But it is worth asking about the sort of productivity gains that come from the simplifications of click-and-drag.
I hate the new word processors that want to tell you, as you're typing, that you made a mistake. I have to turn off all that crap. It's like, shut up - I'm thinking now. I will worry about that sort of error later. I'm a human being. I can still read this, even though it's wrong. You stupid machine, the fact that you can't is irrelevant to me.
Writing is a very isolating occupation.
What I hope is that those with the knowledge of the humanities break into the closed society where code gets written: invade it.
Human thinking can skip over a great deal, leap over small misunderstandings, can contain ifs and buts in untroubled corners of the mind. But the machine has no corners. Despite all the attempts to see the computer as a brain, the machine has no foreground or background.
UNIX always presumes you know what you're doing. You're the human being, after all, and it is a mere operating system.
I'm a pessimist. But I think I'd describe my pessimism as broken-hearted optimism.
When I hear the word 'disruption,' in my mind, I think of all these people in the middle who were earning a living. We will sweep away all that money they were earning, and we will move that to the people at the top.
I feel the best villains are the ones you have feelings for.
When I am around people I most admire, I tend to hug the wall.
I broke into the ranks of computing in the early 1980s, when women were just starting to poke their shoulder pads through crowds of men. There was no legal protection against 'hostile environments for women.'
With all the attention given to the personal computer, it's hard to remember that other companion machine in the room - the printer.
Before the advent of the Web, if you wanted to sustain a belief in far-fetched ideas, you had to go out into the desert, or live on a compound in the mountains, or move from one badly furnished room to another in a series of safe houses.
Through the miracle of natural genetic recombination, each child, with the sole exception of an identical twin, is conceived as a unique being. Even the atmosphere of the womb works its subtle changes, and by the time we emerge into the light, we are our own persons.
Writing was a way to get away from my life as a programmer, so I wanted to write about other things, but of course nobody wanted to publish another story about a family, unless it was extraordinary. When I began writing about my life as a programmer, however, people were interested.
I like mysteries.
Computer systems could not work without standards - an agreement among programs and systems about how they will exchange information.
Computer programming has always been a self-taught, maverick occupation.
Our Constitution is designed to change very slowly. It's a feature, not a bug.
When I am writing, and occasionally achieve single focus and presence, I finally feel that is where I'm supposed to be. Everything else is kind of anxiety.
My approach to being a self-taught programmer was to find out who was smart and who would be helpful, and these were - these are both men and women. And without learning from my co-workers, I never could've gone on in the profession as long as I did.
I won't use Twitter. Twitter posts are thought-farts. I don't care about unconsidered thoughts of the moment.
A computer is a general-purpose machine with which we engage to do some of our deepest thinking and analyzing. This tool brings with it assumptions about structuredness, about defined interfaces being better. Computers abhor error.
No one in the government is seriously penalized when Social Security numbers are stolen and misused; only the number-holders suffer.
The brain is plastic, continuously changing its organization.
There's some intimacy in reading, some thoughtfulness that doesn't exist in machine experiences.
Watching a program run is not as revealing as reading its code.
You can only get a beginner's mind once.
With every advance, you have to look over your shoulder and know what you're giving up - look over your shoulder and look at what falls away.
After we have put our intimate secrets and credit card numbers online, what can prevent us from putting our elections there as well?
I think many people have wonderful stories inside them and the talent to tell those stories. But the writing life, with its isolation and uncertain outcomes, keeps most from the task.
So many people for so many years have promoted technology as the answer to everything. The economy wasn't growing: technology. Poor people: technology. Illness: technology. As if, somehow, technology in and of itself would be a solution. Yet machine values are not always human values.
Y2K is showing everyone what technical people have been dealing with for years: the complex, muddled, bug-bitten systems we all depend on, and their nasty tendency toward the occasional disaster.
The condition of my personal workspace is my own business, as I see it.
A computer is not really like us. It is a projection of a very small part of ourselves: that portion devoted to logic, order, rule and clarity.
It will not work to keep asking men to change. Many have no real incentive to do so. There's no reward for them. Why should they change? They're doing well inside the halls of coding.
It had to happen to me sometime: sooner or later, I would have to lose sight of the cutting edge. That moment every technical person fears - the fall into knowledge exhaustion, obsolescence, techno-fuddy-duddyism - there was no reason to think I could escape it forever.
I like the little semi-competencies of human beings, I realize. Governance, after all, is a messy business, a world of demi-solutions and compromise, where ideals are tarnished regularly.
Staring prejudice in the face imposes a cruel discipline: to structure your anger, to achieve a certain dignity, an angry dignity.
I used to pass by a large computer system with the feeling that it represented the summed-up knowledge of human beings. It reassured me to think of all those programs as a kind of library in which our understanding of the world was recorded in intricate and exquisite detail.
With code, what it means is what it does. It doesn't express, not really. It's a very bounded conversation. And writing is not bounded. That's what's hard about it.
I fear for the world the Internet is creating.
I came of technical age with UNIX, where I learned with power-greedy pleasure that you could kill a system right out from under yourself with a single command.
Truly new inventions take time to play out.