Wednesday, February 26, 2025

Great Artists Don’t Steal — They Originate

Stuart K. Hayashi



Steve Jobs was wise in many ways but, to his eternal discredit, he popularized one of Silicon Valley’s most nauseating clichés and misconceptions: “Good artists copy, great artists steal.” Jobs misattributed this saying to Pablo Picasso. The expression itself does not come from Picasso, though he did say something similar. The painter Françoise Gilot, a colleague of that well-known cubist, remembered him saying, “When there’s anything to steal, I steal.”

The expression actually comes from the poet T. S. Eliot, though he qualified the expression more than Jobs did. The quotation is found in one of Eliot’s essays in his volume The Sacred Wood where he reviews literary scholar Alfred Cruickshank’s monograph on playwright Philip Massinger. Cruickshank argues that Massinger very artfully pays homage to conventions set by William Shakespeare. Eliot disagrees, considering Massinger’s work a low-quality pastiche of the bard. In that context, Eliot says,
One of the surest of tests is the way in which a poet borrows. Immature poets imitate; mature poets steal; bad poets deface what they take, and good poets make it into something better, or at least something different. The good poet welds his theft into a whole of feeling which is unique, utterly different from that from which it was torn; the bad poet throws it into something which has no cohesion. A good poet will borrow from authors remote in time, or alien in language, or diverse in interest. [George] Chapman [poet, playwright, and professor of Stoic philosophy] borrowed from [ancient-Roman Stoic philosopher] Seneca; Shakespeare and [poet John] Webster from [seventeenth-century French philosopher Michel de] Montaigne. . . . Massinger, as Mr. Cruickshank shows, borrows from Shakespeare a good deal [although, unlike Cruickshank, Eliot argues that Massinger did a poor job of it; emphasis added].
Despite Eliot’s use of the word steal, Eliot’s original quotation does concede some importance to originality after all. That concession is removed from Steve Jobs’s iteration, which makes Jobs’s the dumbed-down version. To quote Eliot myself, in this instance Jobs is being a “bad poet.” Jobs borrowed from Eliot and, in the process, defaced what he took. Jobs’s version is somewhat “different from that from which it was torn,” but it is the opposite of “better.”

Altogether, “good artists copy, great artists steal” is said today with insinuations that are both misleading and cynical. For that reason, this cliché deserves to go out of circulation. It is especially disheartening that the cliché is most popular in Silicon Valley, a place once associated with progress and innovation — neither of which can exist if there is no originality.

The immediate question to address is why I would object to this cliché when the steal is not meant to be taken literally. Steve Jobs did not mean that he engaged in literal theft. Nor is that meant by the many Silicon Valley engineers who recite the cliché. But consider what the cliché implies. It mentions only two categories of artist, “good” and “great.” It says good artists “copy,” meaning that originality on their part is to be snubbed. But what about the other category, the “great artists”? In contrast to the “good artists,” the “great artists steal.”

Even as the cliché is said with one’s tongue in one’s cheek, the presumption remains that originality must not be considered as a possibility. That conspicuous omission is similar to the cliché of people saying that what defines your personality as an adult is either your “nature,” meaning inborn biological drives, or your “nurture,” the influence that people other than yourself had exerted upon you in the past. The fact that you have free will, and that your personality as an adult is primarily the result of who you choose to be, is not acknowledged as an option in “nature versus nurture.” Likewise, the upshot of “Good artists copy, great artists steal” is that either you are unoriginal or you are unoriginal, and there is no other alternative.

The insinuation is that originality is nonexistent or, at best, overrated. That insinuation is frequently cited as a rationalization for Silicon Valley companies gaining reputations for having invented particular new product features themselves when they had taken those features unacknowledged from elsewhere.

One might try to make the excuse that in saying “Good artists copy, great artists steal,” one is merely trying to be cute and clever and not trying to belittle originality. According to this excuse, the importance of originality is common knowledge — so much so that there is no need to acknowledge it. Hence, goes the rationalization, when someone says, “Good artists copy, great artists steal,” it causes surprise exactly because most listeners would expect to hear about artists being original, only for the expected consideration to be omitted. But that rationalization does not withstand scrutiny. In that sort of phrasing, the implication is still the same as T. S. Eliot’s “Immature poets imitate, mature poets steal.” The insinuation is that all artists have but two choices: 1) use the work of others in a mediocre fashion, or 2) use the work of others in a fashion that is brilliant. The insinuation is still to de-emphasize originality and its stature.

That brings to mind that, besides the denial of originality overall, there is a second, somewhat-different-but-still-related interpretation of “Great artists steal.” Those who whitewash the second interpretation say that the cliché is not about denying originality completely but instead about stressing the success of entrepreneurs who apply a “Second-Mover Approach to Innovation” rather than a “First-Mover” approach. The interpretation is that when a business financially succeeds in acquainting its customers with a relatively new category of product, often that successful business is neither the technology’s inventor nor even the first business to place that technology on the market.

On that interpretation, the point of “Good artists copy, great artists steal” is not to deny that, in the case of personal computers, the graphical user interface and the mouse were invented by one innovative party, but to stress that it was the second mover, Steve Jobs at Apple Computer, and not the original inventor who gained business success from these developments.

There is much that is wrong even with that second interpretation of “great artists steal,” and its fallacies are derivative of the fallacies in the first interpretation. For most of this essay, I will focus on the originality-denial interpretation. Near the close of this essay I shall return to discussing the second interpretation, that of the “second-mover advantage.”

And, when it comes to the first interpretation, there are many people who call themselves free-market libertarians who take the “Good artists copy, great artists steal” rationalization even farther.

Since the 1970s, the party line of the libertarian movement, in contrast to the Objectivist movement, has been to denigrate intellectual property rights (IPRs). It started with economist Murray Rothbard saying that patents are State-enforced monopolies of an industry and therefore should not exist. Then Wendy McElroy and Samuel Edward Konkin III added that the same accusation applies to copyrights on literature and other forms of art.

Excepting the Objectivist movement, the opposition to intellectual property rights continues to this day. And, as we shall see later, there are even a few self-described Objectivists who deny originality. The only difference among the libertarians opposed to IPRs is that different factions employ different tactics in undermining them. The more openly radical anarcho-“capitalists,” such as those at Liberty International (formerly the International Society for Individual Liberty, ISIL); Auburn, Alabama’s Ludwig von Mises Institute; and New York’s Foundation for Economic Education (FEE), say outright that they want IPRs abolished.

However, more-cautious people at libertarian think tanks in Washington, D.C., and Virginia, such as the Reason Foundation, know that such fanaticism is unpersuasive to most people, unpersuasive even to YouTube vloggers who rant about big media companies accusing their vlogs of copyright infringement. Hence, the more-cautious libertarians of D.C. think tanks have a different approach. They seldom say outright that they want IPRs eliminated altogether. Instead, they watch for high-profile court cases that they anticipate will set precedents on where particular boundaries will be placed when it comes to how courts ascertain whether one party violated another’s intellectual property rights. Any time a court case may set a precedent that will weaken, in future cases, the ability of plaintiffs to enforce their copyrights, the D.C. libertarian think tanks send out essays arguing for that weakening.

In their arsenal, such libertarians have myriad rationalizations for mischaracterizing IPRs as oppressive. I have refuted most of those rationalizations here.

Among those rationalizations is the claim that it is presumptuous and false for any party to claim to be the objectively discernible originator of some technological breakthrough or artwork. Some of the IPR-haters say that many parties invented the innovation simultaneously, which means no one party can rightfully claim to be the originator. Ergo, the argument goes, patents and copyrights should not exist.

Although he is not a libertarian and does not go as far as pronouncing that IPRs should be abolished, one very famous writer to make this case about “simultaneous invention” is Malcolm Gladwell.

And that argument — including Malcolm Gladwell’s rendition of it — smacks around a straw man. A U.S. utility patent does not claim ownership over a general idea for a general category of product, such as “paperclip.” Rather, the U.S. utility patent is on a specific presentation, delineation, or application of function, often within an already-existing category of product. That is why, from 1867 to 1957, the U.S. Patent and Trademark Office granted at least 17 U.S. utility patents to at least 16 different parties for paperclips. It is also why each of these U.S. utility patents was granted prior to the expiration of the previous one.


Nor is it true — contrary to Malcolm Gladwell — that multiple parties arrive at the exact same invention at the exact same time. Rather, what happens is that, within a relatively short duration of one another, separate parties each independently arrive at general ideas that are similar. There have always been differences, however, in the specifics of how each party implements that general idea or general principle. When these separate parties litigate their IPRs, the dispute is over the areas where their different presentations and designs overlap one another. The traditional solution has been for these separate parties to take these areas of overlap and pool them into a single joint trust. Even if there are similar inventions wrought from five separate parties, it does not follow that the inventions should be in the public domain. If these five parties, each currently antagonistic toward the others, can each make a valid case to have contributed to the invention, it does not follow — contrary to the Rothbardians’ assumptions — that parties outside of these five are entitled to unauthorized duplication of the new presentations and designs at the expense of the R-and-D performed by the five inventive parties.

Hence, if these five parties can each honestly claim to have contributed to the invention, the fact that their number exceeds one still fails to invalidate the propriety of enforcing patents.

I have addressed, in greater detail, the “government-enforced monopoly” falsehood and the “simultaneous-invention” red herring over here.

And when it comes to the attempt to deny originality itself, the falsehood is worse still. The attempt to deny originality is a rationalization cited to counter the fact that when separate parties arrive at similar delineations and designs, they can simply pool their patents. The originality-deniers proclaim that the identities of the originators of an innovation cannot be isolated just to five parties, or even twenty parties, or even 135 parties. Rather, continues the false premise, innovations just come down to general “ideas,” and general “ideas” are just in the air. According to this fallacy, the specific originators of an innovation cannot be identified objectively, and so, for all intents and purposes, control over specific original and practicable demand-satisfying designs and presentations should belong to the public domain and not be isolated to the specific parties that invested their scarce resources into the R-and-D that led them to find the most practicable such design. You can see such an insinuation over here.

And the cliché of “Good artists copy, great artists steal” is sometimes cited in the delivery of that fallacious rationalization. Even when “good artists copy, great artists steal” is not said explicitly, the popularity of that cliché reinforces, in the minds of many computer-coding libertarians, acceptance of the rationalization.

That fallacy relies on the false conflation of general “ideas” with the actual very specific and detailed presentations and designs that are subject to patent and copyright. I explain that here.

However, here I discuss the “great artists steal” cliché because it has harmful effects that go even beyond the attempts to rationalize the violation of intellectual property rights. There are three types of lies with which “Good artists copy, great artists steal” is associated when it is uttered. They are:
  1. Originality, in general, is overrated. It is less important than our culture makes it out to be. 
  2. No human creation is truly original anyway. 
  3. Specific parties that have contributed to innovation — which, by implication, means that they exercised originality — cannot be objectively identified. This premise presumes that every attempt to credit some party as a great originator and innovator is a false pretense. It is a false pretense, we are to believe, from the poisonous ideology of individualism. It is similar to the “Great Man Theory of History,” the conviction that history is driven by the pivotal choices of specific individuals rather than by the collective of everyone in general and nobody in particular. [The denial of the fact that, yes, history is largely driven by pivotal choices of specific individuals, should be a topic for another time.]
Contradicting each of those three points respectively, the realities are these.
  1. The phenomenon of human originality still does not receive enough due credit in any culture. Even cultures reputed to be more individualistic, such as that of the United States, are still guilty of giving short shrift to the importance of human originality. The popularity of the cliché “Good artists copy, great artists steal,” in a region and industry once celebrated for innovativeness, itself exemplifies that. 
  2. History is full of examples of specific individuals being original. 
  3. Specific individuals who contributed to innovation — and therefore exercised originality — can be objectively identified. Indeed, there are many historical examples of this. Yes, there are many attempts to discredit the fact that specific originators have been identified properly. These attempts are similar to the red herring of “simultaneous invention.” Such attempts include the denial of the originality of the Wright brothers when it comes to their airplane. These discreditations, also, attempt to obscure the distinction between general “ideas” and the specifics of presentation-and-design to which patents and copyrights apply.
The essay you are now reading attempts to address all three of the above points at least to some degree. However, this essay will be mostly about Point Number 2. This essay will provide case studies of originality and progress in art. We focus on art because it is a discipline whose history is commonly and falsely cited as proof that originality is an illusion — that everything credited as “original” and innovative turns out not to be.

Though this essay is a defense of artistic originality and — this should never have been necessary — an acknowledgment of artistic originality’s existence and importance, I must also spell out what this essay is not. I am not arguing that originality is the be-all and end-all of art. I am not arguing that originality is the most important aspect of art.

This is not to say that the identifiable first-ever usage of a style or device in an artwork is the best instance of it — far from it. There are works of art that are a historic “first” in terms of a particular style or device being used, and yet do not use that style or device as effectively as do much-later works. There are also instances where an artwork is a historic “first” and yet otherwise does not provide any especially moving emotional experience. By the same token, there are great artworks that, though showing a significant degree of originality, were made by artists who had “originality” relatively low on their list of priorities. As long as they do not partake in outright plagiarism, I do not want any artists to debilitate their own efforts out of worry over how derivative versus how original their works are in comparison to others’.

But what I am saying is that originality does exist, it is important, and our culture should cease demeaning it with such clichés as “Great artists steal” and “Nothing in art has ever truly been new.” The prevalence of those clichés indicates not that our culture has too much reverence for individualistic originality but that it has too little respect for it.




Intellectuals Quoting Prominent People — Even Inventors — Who Call Artistic Originality a “Myth”
You might be surprised what sort of prominent people have offered their own spins on the rationalization that downplays originality. It comes even from prominent people who, in contexts where more was at stake financially, jealously and rightfully defended their own IPRs. Alexander Graham Bell, for instance, is justly renowned for having invented the electric landline telephone, only to be smeared by Rothbardian libertarians who falsely accuse him of having covered up the “simultaneous invention” of phones with rival engineer Elisha Gray, scientist Philipp Reis, and even fraudster Daniel Drawbaugh. Those accusations were soundly refuted in the biography by Robert V. Bruce, who had won the Pulitzer for another history of science. Despite that fact, and despite Bell having properly upheld his patents in court, there have been other contexts in which Bell himself fallaciously repeated the originality-deniers’ favorite talking points.

A very intellectual blog that has frequently expressed sympathy for the originality-deniers’ position approvingly quotes Bell on this topic. Given that Bell’s experiences pertained directly to questions of originality and of IPRs, his own originality-denialism must seem all the more compelling. In a letter to Helen Keller’s teacher Annie Sullivan, Bell professes,
Our most original compositions are composed exclusively of expressions derived from others. . . . Our forms of expression are copied — verbatim et literatim — in our earlier years from the expressions of others which we have heard in childhood. It is difficult for us to trace the origin of our expressions because the language addressed to us in infancy has been given by word of mouth, and not permanently recorded in books so that investigators — being unable to examine printed records of the language addressed to us in childhood — are unable to charge us with plagiarism. We are all of us however, nevertheless unconscious plagiarists, especially in childhood. As we grow older and read books the language we absorb through the eye, unconsciously affects our style.
Someone else who should have defended the importance of originality was Mark Twain. Not only was he an author but also an inventor with several U.S. utility patents to his name. That same intellectual blog happily quotes Twain saying,
As if there was much of anything in any human utterance, oral or written, except plagiarism! The kernel, the soul — let us go further and say the substance, the bulk, the actual and valuable material of all human utterances — is plagiarism. For substantially all ideas are second-hand, consciously and unconsciously drawn from a million outside sources, and daily used by the garnerer with a pride and satisfaction born of the superstition that he originated them; whereas there is not a rag of originality about them anywhere except the little discoloration they get from his mental and moral caliber and his temperament, and which is revealed in characteristics of phrasing. When a great orator makes a great speech you are listening to ten centuries and ten thousand men — but we call it his speech, and really some exceedingly small portion of it is his. . . . It takes a thousand men to invent a telegraph, or a steam engine, or a phonograph, or a telephone or any other important thing — and the last man gets the credit and we forget the others. He added his little mite — that is all he did. These object lessons should teach us that ninety-nine parts of all things that proceed from the intellect are plagiarisms, pure and simple; and the lesson ought to make us modest. But nothing can do that [emphases Twain’s].
In the title of the intellectual blog post that quotes those words is even the phrase “The Myth of Originality.” And the same blog smugly quotes Tropic of Cancer author Henry Miller in declaring,
And your way, is it really your way? . . .

What, moreover, can you call your own? The house you live in, the food you swallow, the clothes you wear — you neither built the house nor raised the food nor made the clothes. . . .

The same goes for your ideas. You moved into them ready-made.
Note that Henry Miller’s argument is a variation on President Obama’s “You Didn’t Build That.” Houses, food, and clothing are produced through a market-based division of labor. That is not a social collective but the sum of many different individuals’ actions. Similar to the originality-deniers’ argument, You-Didn’t-Build-That takes the fact that many people cooperated in helping an entrepreneur succeed and then tries to “spin” that into proclaiming that the entrepreneur’s success was due not primarily to the entrepreneur’s choices but instead to everyone in general and nobody in particular. But as I have pointed out before, the entrepreneur already did pay everyone who helped her succeed, and she does have a record of the specific individuals who contributed to her success. That record is called the entrepreneur’s “payroll.”

Likewise, we do have a record of innovations that, until specific points in history, were completely unprecedented. Also likewise, when it comes to relatively recent innovations, there are times when we can reliably ascertain the identity of who originated what.

Henry Miller’s argument relies on a Stolen Concept. For over 90 percent of human history, there was no method to obtain food except hunting and gathering. There was no agriculture. For a particular region, someone had to be the first to have the idea to plant seeds and grow crops. Some ancient person was the first to have the idea of putting an animal’s pelt on one’s own body to keep warm. Some ancient person had to be the first to have the idea that instead of relying on a cave to protect from the elements, one could create a makeshift shelter in the wilderness: the first huts.

The fact that humans are capable of artistic originality is, to me, so obvious that it breaks my heart that I find it necessary to write an essay to explain this. To give you an idea of why I find it important, I will recount some conversations that show how trendy it is for intellectual people to deny the fact of originality.




The Fact of Artistic Originality Denied by People Who Claim to Love a Book That Glorifies Originality
To show why I find this topic quite pressing, I want to tell of some of my experiences grappling with these ideas. A real eye-opener for me, when it comes to recognizing the significance of originality, was when I read The Fountainhead. This book, as well as Atlas Shrugged, is all about innovation. Whereas most of the other architects of the early twentieth century insist on making modern buildings resemble those of the past — Ralston Holcombe with his Renaissance style, and the builders of the Aquitania Hotel going with the Gothic style — Howard Roark takes a new path. Roark is not the first of the school of modern architecture, but he is one of its early practitioners. This new school of thought is consistent with a scientific appreciation for the world itself, applying the principle enunciated by the real-life modern-architecture pioneer Louis Henri Sullivan, “Form ever follows function.” And as demonstrated by the novel’s climax, Roark will go to great lengths to protect the integrity of the very specific designs through which his original ideas find their implementation.

The emphasis in The Fountainhead on the importance of originality is a corollary of both its exploration of human individualism and its exploration of technological progress.

Dictionary.com says that to innovate is “to introduce something new; make changes in anything established.” To the extent that innovation and “progress” refer to unprecedented improvements in the procedures by which human beings carry out their affairs, innovation and progress are predicated upon originality. Originality refers to initiation and initiative. As innovation and progress refer to a series of beneficent changes, that series would not commence if not for that first change, that first step, like a first ancient mutant fish stepping onto shore. The Totality of Existence did not need some supernatural entity, one external to all of Existence, to bring Existence itself into being. But everything of human affairs within Existence does need to be set in motion by a mover, a Prime Mover. If the transmission of ideas is a river, the waterflow had some source . . . the fountainhead.

In The Fountainhead, Howard Roark is not the first modern architect, but he appreciates the fact that he had beneficent forebears, such as his mentor Henry Cameron, who did start modern architecture. Likewise, the innovative designer Frank Lloyd Wright did recognize and appreciate the fact that modern architecture had a first practitioner, possibly his mentor Louis Henri Sullivan.

Absent the innovation’s origin — the manifestation of someone’s originality — there is no innovation and no technological progress. To say that there is innovation and technological progress but that there has never been originality — and, sadly, there are some who say that — is to commit what Ayn Rand identifies as the Fallacy of the Stolen Concept (“stolen,” in this context, being the mark of a low-quality artist).

And just as beneficent progress presupposes originality, originality itself presupposes the psychological individualism that informs Howard Roark’s individualistic ethics. If someone in the arts or the design field merely copies the achievements of others, that copycat is relying on the judgment of those others more than her own. And the result will be that everything goes unimproved. Hence, the degree to which some party in the arts and the design field is able to devise some improvement in methodology is the degree to which that party has deviated from the strict emulation of others. That is, the extent to which that party devised an improvement in method is the extent to which that party applied independence of thought to the matter. Most acts of psychological independence do not result in their practitioner gaining an insight that had been without precedent in all of human history. But in every instance where someone did piece together an insight unprecedented in human history, it was, by definition, formed — and identified as such — only through that person’s exercise of psychological independence.

Such psychological independence does not entail someone behaving strangely just for the sake of behaving strangely and inviting others to gawk. If a man insisted on going out with his pants on backward, that would not be following convention, but it is doubtful that this would represent a great advancement in culture. Hence, The Fountainhead also satirizes the counterfeit individualism of pretentious hipsters. By contrast, as one thinks independently in drawing from the facts, there are moments when this independent thinker will notice and acknowledge something that other experts in the discipline either have not noticed or at least have not acknowledged sufficiently. In such cases, it is good for that thinker to bring attention to that which deserves more consideration. In the 1800s, when most educated men were still Young-Earth Creationists, Charles Darwin showed important initiative.

As he himself publicly acknowledged in writing, Charles Darwin was nowhere near to being the first person to present evidence for the general phenomenon of biological evolution — not even the first in his own family. As far back as ancient Greece, the philosophers Anaximander and Empedocles anticipated aspects of modern evolutionary theory, the former going as far as saying humans descended from a fish-like creature. During the Aristotle-influenced Islamic Golden Age, there were even Muslim philosophers who foresaw attributes of it. And much of the case that Darwin made relied upon discoveries from other scientists, such as the geologist Charles Lyell. Yet the people who theorized about biological evolution before him had still been fewer than a thousand in number, possibly even fewer than a hundred.

Even considering the many forebears, it was still through an act of psychological independence that Darwin accepted the facts. Both he and Alfred Russel Wallace made an advancement — an evolutionary leap in science itself — by explaining not merely the general idea of biological evolution but the specific mechanism by which it occurs: the process of natural selection. And that someone other than Darwin had also pieced together the fact of natural selection does not, contrary to Malcolm Gladwell, discredit the fact of originality.

Had Henry Miller been right that all ideas are — in his words — “ready-made,” then it would mean that Darwin and Wallace deserve no more credit for explaining natural selection than did all of their contemporaries who did not describe, or conceive of, natural selection. And it would mean that these two men do not deserve any more credit for discovering natural selection than did people, such as Bishop Wilberforce, who actively denied the reality of biological evolution. Moreover, Darwin demonstrated evolution’s reality to a degree that even Alfred Russel Wallace did not. Darwin’s intellectual journey is a real-life example of the sort of phenomenon that The Fountainhead dramatizes.

Though it has been said that Steve Jobs himself admired Ayn Rand’s writings, Roark has some choice words for those who snicker that good artists copy and great artists steal. In the script of the movie adaptation starring Gary Cooper, also written by Rand, Roark says,
Man cannot survive except through his mind. He comes on earth unarmed. His brain is his only weapon. But the mind is an attribute of the individual. There is no such thing as a collective brain. The man who thinks must think and act on his own. . . .The creator stands on his own judgment; the parasite follows the opinions of others. The creator thinks; the parasite copies.
Roark expresses the same idea in the original novel. But this time, his phrasing is more like T. S. Eliot’s than Steve Jobs’s:
We inherit the products of the thought of other men. We inherit the wheel. We make a cart. The cart becomes an automobile. The automobile becomes an airplane. But all through the process what we receive from others is only the end product of their thinking. The moving force is the creative faculty which takes this product as material, uses it and originates the next step. This creative faculty cannot be given or received, shared or borrowed. It belongs to single, individual men. That which it creates is the property of the creator. . . .

Nothing is given to man on earth. Everything he needs has to be produced. And here man faces his basic alternative: he can survive in only one of two ways — by the independent work of his own mind or as a parasite fed by the minds of others. The creator originates. The parasite borrows.
Note that Ayn Rand does not say that an innovation comes fully formed completely out of nowhere like Athena out of the head of Zeus. Roark says very plainly that, of course, innovators typically learned from the well-demonstrated developments of their forebears. What is of pertinence to this discussion, though, is that the innovator then takes the “next step.” And as the innovator pays well-earned gratitude toward those forebears, the new “step” taken by the innovator is to be attributed properly to, and rightfully owned by, that innovator herself. That “next step” does not belong to the forebears who inspired and taught the innovator as, by definition, those forebears did not take that “next step.” Nor can that “next step” — nor the fruits of it — be rightfully credited to everyone else in society who expressed no more than a mild interest in contributing directly to the innovator’s pursuit of that innovation. It was the innovative party that took that next step; all the other people did not. That is why the innovative design or presentation that the rational faculty “creates” is rightfully “the property of the creator.”

That innovation occurs in a nexus of both familiar and unfamiliar ideas is an observation that goes back at least as far as Aristotle in his treatise on plays and drama, Poetics. Poetics itself might be an innovative “first” — as I type this, scholars commonly regard it as the oldest-known work on literary criticism and literary analysis.

Aristotle’s explanation is as follows. For something to interest you, it has to have a felicitous balance of elements that are familiar and those that are unfamiliar. If something is completely familiar to me, then there is nothing left for me to learn about it, and therefore I am too bored to bother with it. By the same token, if something is completely unfamiliar, then it seems not to have anything to do with me either, and therefore I need not bother with it.

But that dynamic shifts with the proper balance of the familiar with the strange. That there are aspects of something that are familiar means that it pertains to me and might address concerns I already have. Additionally, that there are aspects that are alien means that there is still more to learn. Put another way, I can take a new perspective on something I had previously thought I knew much about, and then examine it from a new angle. That there is some familiarity which is incomplete is what draws my interest. In turn, that the alien elements might shine additional light on what I had thought was familiar sustains that interest. This can also be phrased as a balance of the known (familiar) with the unknown (alien).

That need to have a balance between the familiar and the strange extends to whether a new invention will be widely adopted by consumers and satisfy their marketplace demand. Inasmuch as the invention has features that are familiar, I can understand the invention’s purpose and can learn how I may use it to address concerns that were already plaguing me. Conversely, the aspects of the new invention that are unfamiliar and unprecedented create the possibility either
  1. that the invention can solve for me a problem that previously could not be solved; or
  2. that the invention can mitigate a problem of mine to a degree greater than prior attempted solutions did.
That can be seen with baby boomers and Generation X, already familiar with landline telephones, adopting mobile phones, a technology pioneered by engineers like Martin Cooper. The mobile phones had attributes in common with the landline phones with which baby boomers and Generation X were already familiar. Like landline phones, mobile phones had keypads of buttons labeled with Arabic numerals from 0 to 9. As with landline phones, users would push these buttons to input the numbers of parties they intended to call. And again as with landline phones, users would put their ears next to a part called a “receiver,” from which the user would hear speech from the speaker on the line’s other end, and speak into another part called a “transmitter.”

However, landline telephones were connected by landlines running through houses and other buildings. To use the landline telephone, one would be confined to a particular location. The landline phone’s transmitter converted speech and other sounds into electrical signals sent through the landline. At the receiver end, the electrical signals were converted back into sounds. What was new about mobile phones was that they did not require the landlines. When someone spoke into the transmitter, the sound was converted into signals sent instead through the electromagnetic spectrum all around us, the airwaves. This novel, alien aspect enabled consumers to carry on their phone conversations free of the previous limitations imposed by geography.

Note that acknowledgment of, and appreciation for, Martin Cooper’s innovation here does no injustice to the forebears on whose work he had built, such as Alexander Graham Bell. Contrary to Mark Twain’s presumption, to recognize that originality on Martin Cooper’s part contributed to mobile phones emerging on the market when and how they did — that this feat required some thoughts that were actually unprecedented — is not to downplay, deny, or “forget” predecessors from whom Martin Cooper had learned. Nay, it is simply to appreciate that this development depended on those like Martin Cooper who undertook intellectual integrations and actions that were new, novel, and indeed unprecedented.

Had there been nothing unprecedented in the works and thoughts of the engineers who pioneered in developing the first mobile phones, then mobile phones never would have found their way into households. All the electric telephones in existence would still only be landline ones. That would be the case if Henry Miller were right that the “ideas” of all people — including the engineers behind the earliest mobile phones — were “moved into... ready-made.” In 1955, the schematics needed to produce a practicable working mobile phone for consumers were not “ready-made.”

The principle applies in every instance when the marketplace is hit with a new technology that dramatically changes much of human activity. And it applies even when the exact parties who originated that new technology cannot be identified with great confidence. There was a time when no humans had automobiles. Now we do have automobiles. At some point, for this to have happened, someone had to think some specific thoughts that no one in the past had ever conceived before. Someone had to have performed some specific actions that no one in the past had ever undertaken before. Amid these developments, Carl Benz deserves acclaim for his three-wheeled motorized carriage, and Gottlieb Daimler has earned some fame for his four-wheeled creation as well.

In contrast to Mark Twain’s straw man, celebrating originality is not about giving short shrift to the innovator’s predecessors, contemporaries, and competitors. Rather, to celebrate originality is to acknowledge the fact that there are points in history where people are indeed capable of thinking thoughts and accomplishing feats that are unprecedented. These are thoughts and feats that are, in contrast to everything that human beings have already thought and done earlier, markedly distinguishable to an extent that is decisive.

With respect to how that which is interesting and innovative combines elements that are familiar and conventional with those that are novel, this is how Aristotle’s Poetics puts it,
The perfection of style [in poetry, drama, and theatre] is to be clear without being mean [ordinary, mundane]. The clearest style is that which uses only current or proper [normal] words; at the same time it is mean [has some plainness]… The diction, on the other hand, is lofty and raised above the commonplace which employs unusual words. By unusual, I mean strange (or rare) words, metaphorical, lengthened — anything, in short, that differs from the normal idiom. Yet a style wholly composed of such [unfamiliar] words is either a riddle or a jargon [meaning confusing]… A certain infusion, therefore, of these elements [both the familiar and the strange] is necessary to style; for the strange (or rare) word, the metaphorical, the ornamental, and the other kinds above mentioned, will raise it above the commonplace and mean [mundane], while the use of proper [ordinary, everyday] words will make it perspicuous [easy to understand]. But nothing contributes more to produce a cleanness of diction that is remote from commonness than the lengthening, contraction, and alteration of words. For by deviating in exceptional cases from the normal idiom, the language will gain distinction; while, at the same time, the partial conformity with [everyday] usage will give perspicuity [clearness]. The critics, therefore, are in error who censure these [novel] licenses of speech, and hold the author up to ridicule. 
That last line applies more broadly to our discussion. Those who point to the familiar and conventional elements of innovative designs and presentations lack perspective when they cite those conventionalities to downplay the importance of the aspects that are novel and unprecedented.

The twentieth-century industrial designer Raymond Loewy had his own manner of phrasing it. He concluded that for the marketplace to adopt a new design, it should be “MAYA — Most Advanced [the unfamiliar side] Yet Acceptable [the familiar side].”

For these reasons, the point of Howard Roark’s that I quoted earlier is quite straightforward. But, to many people, it isn’t — even among some who have gushed about how much Ayn Rand and The Fountainhead have inspired them.

I will tell you of two exchanges I had.

Years ago, tired of rationalizations for originality-denial, such as those from Rothbardians and that intellectual blog to which I linked above, I took to Twitter. I tweeted,
“Good artists copy, great artists steal.”
—Insipid cliché

“Great artists originate.”
—Me
I anticipated that I might get a response from a churlish apologist of originality-denial. And I did. But it was not in the exact form I expected. It came from someone named Andrew. We had not met face-to-face, but we did become a bit acquainted online. Around the middle of the twenty-first century’s first decade, Andrew had started one of the earliest blogs about Ayn Rand’s Objectivist philosophy. He had read Ayn Rand’s books, such as The Fountainhead, and expressed that they inspired and informed him. Surely, I had thought, someone who publicly expressed such devotion to The Fountainhead and Ayn Rand’s ideas would understand the importance of originality, especially with how it relates to individualism, psychological independence, and progress.

Well, as it turned out, no.

Presuming that I had never encountered it before — as opposed to the thousands of times that I had — Andrew proceeded to attack the usual straw man, the usual misrepresentation of what a defender of originality actually says. Andrew proceeded to recite how it is true that great artists steal, because no work of greatness emerges ex nihilo — the Latin expression being among his exact words. Instead, as he recited in his whacking of the straw man, even in works of Shakespeare that are celebrated as original there is use of established conventions, or at least conventions that were new in Shakespeare’s time but which Shakespeare himself did not invent. What made Andrew’s mansplaining worse was that I had my real-time notifications on.

Exactly as Andrew was lecturing me condescendingly, Twitter had notifications appear on the screen showing the whole tweet. It began “1/3,” indicating that this was merely the start of a series. Not only was Andrew reciting the usual straw man at me, but it would not stop at one tweet; it would be a whole series.

Sick of this, I very bluntly tweeted at Andrew that I wanted him to spare me the straw man; no defender of originality says — or even implies — that any artwork or invention appreciated as novel just emerged ex nihilo, absent some more-conventional features pertaining to the preexisting context that the innovation was made to address. Nor does acknowledging the existence and importance of originality hinge upon the notion that works of originality emerge ex nihilo, free of influence from any more-conventional predecessor.

To that, Andrew expressed his annoyance and tweeted “Calm down.”

I tweeted, less sharply than before, that I wouldn’t.

Andrew then tweeted some onomatopoeia conveying that his face fell flat on a desk in vicarious embarrassment for me, and said that he doesn’t have time for close-minded and “judgmental” people like me.

I expected some defensiveness on Andrew’s part, as that is common for Twitter and Facebook. But I did not anticipate, or initially recognize, the serious degree to which I had angered him.

Thinking this over for some minutes, I tweeted to Andrew that I apologized for my choice of words in my immediate reaction but that I maintained that, due to its harmfulness in how it misleads people, expressions of approval for the “great artists steal” cliché should always be reproached.

I did not receive a reply. And some years later, I learned why.

To the extent that the originality-deniers acknowledge that arousal and innovation come from stimuli that combine both familiar and unfamiliar elements, the originality-deniers place emphasis only on the part that is familiar. To the degree that the originality-deniers acknowledge that there is novelty to something, they downplay it as nothing more and nothing better than an Emergent Property to have arisen from what is truly notable: the familiar elements. But to value originality is not to place, as they do, the bulk of the credit on the familiar elements. To value originality is to place attention on, and thus develop appreciation for, the Emergent originality itself.

According to Andrew, Shakespeare could not have created something beautiful and new had Shakespeare not also used literary devices that were not his own invention. Hence, the originality-denier concludes, it is true that “great artists steal.” But we should consider how the word steal is used here. Even as we recognize that this does not refer to a literal theft that should be punished by law, the “steal” is not flattering to Shakespeare or anyone else. By definition, it implies that Shakespeare does not have complete rightful “ownership” over what has previously been credited as his innovativeness. 

By contrast, to recognize originality is to face the allegedly just-cheeky label of “stealing” as the belittlement that it is. It is from Shakespeare’s clever combination of familiar elements in a new context that there Emerges a form of aesthetic experience that is indeed novel — full stop. No part of the emergent novelty need be diminished by any accusation of “stealing” — not even a mischievous use of that term. Instead, from the novelty of it, Shakespeare earns full “ownership.” Period. To acknowledge and appreciate the existence, occurrence, and importance of creative originality logically requires no more justification than that.

There is an epilogue to the story of Andrew. On Facebook I had been communicating with many people who claimed to have been inspired by Ayn Rand and Objectivism, but I was disturbed by some very creepy cliques that had formed among them. One clique agreed with MAGA and explicitly with all of the associated white supremacism, somehow trying to cite Ayn Rand as if she would have approved of all of that. The other clique was of Robert F. Kennedy, Jr.-types who preached against vaccines, GMOs, and any other well-corroborated scientific findings that contradicted their Paleo Diet and other favorite fetishistic beliefs. Because life is too short for all of their melodrama, I Facebook-blocked a lot of members of these two cliques.

Long after my Twitter exchange with Andrew, I was shown very disturbing screenshots. Members of both the white-supremacist clique and the health-fad clique had come together and formed a whole club dedicated to airing grievances against me for Facebook-blocking them. I was shown one Facebook thread in particular. In retrospect, it might sound unsurprising that white nationalists and RFK Jr.-types would be getting together, as both groups are prominent in MAGA. But this alliance occurred before the discovery of COVID-19 and before vaccine-denialism became a prominent part of the MAGA movement. Amid the grievances against me from the RFK Jr.-types and the white nationalists, in that thread I saw one familiar name in particular. It was Andrew. Andrew said he was relieved that other Objectivists had learned the unflattering truth about Stuart. He mentioned having learned first-hand that Stuart is an “ass,” and, upon learning that, he had blocked me immediately on Twitter.

That is why he never saw my apology to him. And this is the guy participating in this hate-fest about how I am the one who is too quick to write other people off. Upon seeing that, I can tell you that far from being made to feel remorse over having Facebook-blocked these cliques, I was reminded of why I had Facebook-blocked them in the first place. And I had much less reason to feel sad about Andrew having blocked me over Twitter.

I don’t know if Andrew is still interested in Ayn Rand’s writings. But I do know that this guy knew how to hold a grudge. That exchange over Twitter haunted me as well — it continues to do so — but for reasons different from Andrew’s. I had been blissfully under the impression that someone enthused and gushing over Ayn Rand and The Fountainhead would have enough appreciation of originality’s importance to understand how demeaning it is when people smugly repeat “Good artists copy, great artists steal.”

That even people who proclaim love for Howard Roark have such little respect for his defense of originality was, for me, a rude awakening.

But Andrew would say I’m just rude.

And that was not a fluke. Not long after — though it did not result in as much long-term animus — I had yet another, similar encounter.

There was a raspy-voiced European whom I had met face-to-face in Hawaii and with whom I continued to communicate over Facebook. He, too, said he was inspired by Ayn Rand and was all about free-enterprise philosophy. But, creepily, this European wanted me to be open-minded toward Murray Rothbard and his anarcho-“capitalism.” Being swept under the rug was acknowledgment of Rothbard’s long record of sleazy behavior.

The European had known from my previous Facebook posts that I valued the principle of originality. He had already seen how greatly I worried about this trend, existing even among self-proclaimed free-market advocates, of denying the existence of, and necessity for, originality. One day, not directly solicited by me or anyone else, he posted a YouTube video on my Facebook wall. The video was a documentary about how many artworks praised as innovative actually rely on the revival of older traditional methods. The documentary’s creators did not mean that the incorporation of these traditions diminished the new artworks’ importance; rather, they held that it strengthened it. But the European was, like the intellectual blog that quoted Mark Twain, more interested in scoring points for the originality-denial side.

He told me that this showed that “Nothing is truly new.”

I got snippy and replied, “That’s right. Automobiles weren’t new in Germany in the late 1800s. People have always had them, even in the caveman days.”

The European didn’t seethe to the extent that Andrew did, but he still was not amused. In his reply he just restated that every artwork that is praised as innovative actually incorporates traditional conventions and therefore he is correct that nothing is truly new.

The European is invoking the same logical fallacy as Mark Twain in the passage that the intellectual blog quoted. I learned the logical fallacy’s formal name from Raymond C. Niles, and it is called the Composition Fallacy. A Composition Fallacy occurs when someone notices a characteristic of a particular part of X and then presumes that the characteristic applies to X overall. For example, an automobile has parts made out of plastic. It would be a Composition Fallacy to conclude from this that the entire car is made of plastic. Likewise, every normally functioning human eye has a blind spot. It would be a Composition Fallacy if someone concluded that this means that no human eye is capable of sight. And a similar fallacy is employed in the citation of original works’ conventional aspects to deny originality per se.

Since the assumptions of Mark Twain and the European are not valid, the presence of conventional aspects in an artwork or invention does not negate the aspects that are novel.

Now one might ask if I am engaging in the same Composition Fallacy as that European but in reverse. If an artwork has parts both conventional and unconventional, and it would be a fallacy to say that only one of those elements characterizes the entire artwork, then who am I to conclude the reverse of the European: that this artwork is not all-old-and-conventional but instead all-original? But I am not saying that the artwork is all-original. What I am saying is that the original aspects are important, and that this novelty added by the creator deserves to be recognized accordingly. Moreover, even if only a small fraction of an artwork is original, that already disproves 100 percent the clichés that “Nothing in art is truly new” and that originality is but a “myth.” And it demonstrates, in spite of Steve Jobs’s misquotation, that an artist has alternatives other than “copying” and “stealing.”

That so many people who claim to revere The Fountainhead have recited to me the straw-man attack on originality has taught me the grimmest of lessons. It is that people’s claiming to revere a book that extols the virtues of originality does not necessarily translate into their valuing those virtues. Such people conveniently gloss over how the book itself had anticipated and rebutted the very straw-man argument they confidently recite. Again, The Fountainhead points out that works of originality do have conventional aspects to them and that those conventional aspects do not invalidate the aspects that are original. Again, Roark notes that the conventional aspects that the innovator learns “from others is only the end product of their thinking” whereas the “moving force is the creative faculty which takes this product as material, uses it and originates the next step.” It is that “next step” that is the originality and which belongs to the innovator, not to the predecessors from whom the innovator has learned.

The originality-deniers are also answered in Atlas Shrugged. In inventing his new alloy, Rearden Metal, Hank Rearden relied upon the inventions of those who had preceded him in time, such as blast furnaces. On account of that, the churlish James Taggart tells a young woman named Cherryl,
He didn’t invent iron ore and blast furnaces, did he? . . . Rearden. He didn’t invent smelting and chemistry and air compression. He couldn’t have invented his Metal but for thousands of other people. [Sounds like Mark Twain.] His Metal! Why does he think it’s his? Why does he think it’s his invention? Everybody uses the work of everybody else. Nobody ever invents anything [emphasis in book].
To that, Cherryl tells him, “But the iron ore and all those other things were there all the time. Why didn’t anybody else make the Metal, but Mr. Rearden did?”

That these originality-deniers are refuted by what they have proclaimed to be a favorite book of theirs makes it seem as though they and I have not read the same book. When Andrew read that exchange between James Taggart and Cherryl, apparently he didn’t get as much out of it as he should have.

Besides being another repetition of the clichés that deny originality, the exchange with the European lingered with me for another reason. The European was very obviously wrong in claiming that there was never anything “new” in the history of art — even ten seconds of thought should have shown him the illogic of that conclusion. Of course there had to be “firsts” within that chronology — even if the exact instance of a “first” could not be identified — otherwise there would never be any visual forms of artistic media other than cave paintings.

Still, one aspect of this exchange left me feeling inadequately informed. Prior to it, I had not given much thought to the “firsts” in the history of the development of different artistic media. Had the European asked me for examples of “firsts” in the history of art, I would not have been able to cite, with confidence, many case studies.

That brings me to a matter that especially motivated me to write this essay. In the years since that exchange, I have looked into the history of innovation in artistic media. Even when historians cannot identify with confidence the exact party to have originated a particular artistic style or method, the very fact that the artistic style or method now exists — when there was a time when it had not — necessarily presupposes that some party had to be the first to practice it. Again, there can be no trend of change, advancement, or innovation without that trend having been initiated. And in that initiation is the originality.

Still, historians have often found strong evidence by which they can state with confidence that some specific person is the earliest-known — known among these scholars, at least — to have made use of a particular method or style. Hence, I want the rest of this essay to be a litany of examples of innovation in artistic media. When it comes to more-recent developments — that is, from the twentieth century onward — I can even name particular persons who plausibly may have been the first to have tried a specific style or method.

It is true that sometimes historians name one pioneer, Mr. Y, as likely the first person to have dabbled in a style or method, only to learn later that he had been chronologically preceded by Mr. X, even if Mr. X’s attempts were cruder or less practicable in comparison. Originality-deniers are fond of citing such instances as proof not only that the style’s or method’s “first” practitioner can never be identified, but also that there cannot be one true originator — and, by extension, that there should be no intellectual property rights. I often spoke, face-to-face, with a Rothbardian economics professor who would make that sort of manipulative point in debate.

Notwithstanding the logical fallacies of the originality-deniers, the psychological phenomenon of artistic originality is not delegitimized by it turning out that Mr. Y had been preceded in some form by Mr. X. Rather, enthusiasts of art history can still appreciate and credit Mr. Y for being a pioneer, a relatively early practitioner of the style or discipline, and they can also acknowledge Mr. X as having been a predecessor, even if Mr. X’s versions were simpler than Mr. Y’s. And that logic still applies if it turns out Mr. X was preceded by Mr. W, who in turn was anticipated by Mr. V, and on and on. The fact that there had been “firsts” remains; the fact of originality remains. And, lest anyone be fooled by the enemies of patent rights, the fact that our modern versions of products did not emerge in the complete form by which we know them today, but instead took that form through gradual increments by separate inventors over decades, does not alter that fact either. Contrary to the assumptions of the originality-deniers, the matter does not have to be any more complicated than that.

With that in mind, in my history of innovation in art, I will be giving some examples of possible “firsts.” I can name, for example, the earliest example known to historians of an American-made fiction-narrative jungle-safari movie.




“All of These ‘Firsts’ in Art Relate to Changes in Technology, But Not to Basic Ideas Like ‘Love’ and ‘Heartbreak,’ So There’s No Originality After All”
But before providing my litany of case studies, I should make special mention of how many of the innovations in style and method are largely the result of advancements in technology. One might say, “Even if the technologies through which art is presented do change, the fundamental ideas conveyed in art, such as love and grief, have been constants since the Stone Age. Therefore, when it comes to the fundamentals, the most basic themes in art have remained unchanged. Hence, there really is nothing of importance in art that is new or original.”

But such an objection is shortsighted. As Marshall McLuhan pointed out, “The medium is the message.” That is, the type of artistic medium — say, dance versus painting — influences the specific manner in which the ideas are conveyed. And the manner in which the ideas are conveyed influences the overall experience and thereby the manner in which the ideas are processed. This allows for greater diversity and sophistication in how ideas are transmitted.

Furthermore, as societies grow accustomed to changes in technology, it allows for the telling of stories with plots that previously would have hardly been comprehensible to their intended audiences. Consider the motion picture You’ve Got Mail. This is the story of two rival bookstore owners, each of a different sex. On account of their business rivalry, they hate each other. Yet in the privacy of their own homes they log onto the World Wide Web and e-mail one another anonymously, using usernames. And each of them falls in love with the other’s online persona.

That movie is based on earlier works: a play, a talkie movie, and a musical movie — respectively, Parfumerie (1937), The Shop Around the Corner (1940), and In the Good Old Summertime (1949). As these productions predate the World Wide Web, naturally they are about a different technology. In these versions, the two romantic leads are co-workers who dislike one another in their everyday face-to-face encounters. However, again privately in each of their respective separate homes, they correspond by snail mail. In writing their love letters, they do not use their real names.

Even though the communication/information technology that is used as a plot device is different, You’ve Got Mail and Parfumerie have the same basic premise. That may initially seem to be evidence that technological leaps cannot change the basics of the plots of fiction. But consider the Stone Age when all human societies were hunter-gatherer clans.

It was not only that there was no postal service. There was a point in history, prior to agriculture replacing hunting and gathering as the main source of food, when people did not even live in huts separate from one another. At this point, there was little distinction between “family” and “society”; the entire clan was a family unit in that most of its members were related genetically, not far removed from one another. (Although people had no conscious knowledge of genetic diversity, it was maintained when different clans came into contact and swapped spouses.) Except for shamans conducting their rituals in caves surrounded by cave paintings that seemed to be animated by firelight, most people had far less privacy than people in later eras would have. And this was prior to written language.

It was not as though someone could produce an anonymously authored document expressing one’s love for someone else and then privately have that document delivered to the object of those amorous intentions. These ancient hunter-gatherers already were telling stories around a fire at night — often the shaman was the storyteller — and some of these were love stories. But for ancient hunter-gatherers who had only encountered other hunter-gatherers, a love story such as You’ve Got Mail would be incoherent. The plot is understandable to us — even if we do not find the movie entertaining — not only because of changes in technology, but also as a result of the extent to which our society has grown accustomed to those changes.

For such reasons, it is legitimate to say that when artists use new technologies to express themselves, the new technologies allow for new ideas to be explored. They also allow old general ideas to be examined from new angles, resulting in new specific ideas. Accordingly, new styles and artistic methods resulting from technological changes do influence both the presentation of ideas and the ideas presented. In short, insofar as an innovation in artistic style or method resulted mostly from advancements in technology, it is still an innovation in the arts per se, and the objection that it does not count as an innovation in art-as-such does not withstand scrutiny.

Moreover, even when the ideas are old, there are “firsts” in terms of how the old ideas are presented in new artistic media. As of this writing, the earliest-known written use of first-person pronouns is in a work of art. It is a poem and chant from Enheduanna, an ancient Mesopotamian princess and high priestess. She speaks both for herself and for other citizens of her Akkadian city-state in praying to a goddess.

The early history of written and performed artworks in the Bronze Age was tied to civics: these artworks were overseen directly by the leader of the society and were promoted as advancing the society as a whole. That was a norm even for pre-agricultural clans. Often dances and storytelling were performed publicly — not that privacy, again, was much of an option back then — in a manner intended to promote the interests of the community. Examples would be rituals said to have honored the gods and spirits so that they would shower good fortune upon the community.

The ancient Greek lesbian poetess Sappho exemplified something that, in comparison to most of the human history that preceded her, was quite new. She and other Greek poets began to write down, on papyrus, poems expressing their own personal feelings. No longer was verse confined only to rituals of civic importance; it could also be for one’s private personal reflections and satisfaction.

Yes, people had already felt love and physical attraction for millennia prior to Sappho. But being able to write on papyrus — a technology then relatively new to the Greeks — within the privacy of a walled enclosure definitely helped Sappho express those ancient emotions in a manner that was new. Even if she was not the first poetess to do this, she was one among a relatively small number of participants in an innovative trend. Love and romantic attraction were not new, but the manner in which this new artistic movement explored and articulated these feelings was indeed new and original.

Mindful of that consideration, let us get on with the survey. I hope such a survey, though far from exhaustive, will expose to readers the absurdity of insisting that there is no originality, that nothing in the history of art has ever truly been new.

And so it begins.




Theories on How Cave Paintings and Cave Sculptures Were Invented
I have said that if it were true that there was no innovation in the arts, then there would be no form of visual media other than cave paintings. But I must acknowledge that cave paintings themselves were an innovation. There was once a day in antiquity when there were hominins but no cave paintings. And then at some point, cave paintings did exist. Someone had to invent them.

Social scientists such as Izzy Wisher have a theory on how this may have happened. Psychologists recognize a phenomenon called pareidolia. It is when you look at one sort of object, and patterns of shade or color or shape in it remind you of an entirely different sort of object. It is pareidolia, for instance, when you look at a cloud and say that its shape resembles that of your cat.

The theory is that pareidolia played a role in inspiring the first figurines and first cave paintings. According to the theory, some ancient person saw a formation of rock in a cave and thought it already resembled a person or animal, such as a bull. However, it occurred to this person that the formation could be reshaped further. Hence, the person carved and reshaped the rock formation to become an even stronger likeness of the person or animal. These would become the first statues and figurines.

Something similar happened with cave paintings. Some ancient humans noticed that different types of rock come in different colors. They noticed that images could be made on the wall of a cave by smearing, onto the wall, wet clay or sticky dust particles from a different sort of mineral. At some point, they saw markings already existing on the cave wall and, again, pareidolia took effect. Of the markings already present, some already looked like people or animals. Once again, they smeared other types of mud and dust onto the markings to alter them, making them into more well-defined representations.

Archaeologists and anthropologists have various theories about these cave paintings. One is that they were made for private rituals carried out by shamans — also what would be called “witch doctors” or “medicine men.” The shaman may have believed that an image of a particular sacred animal had captured its supernatural power. As I said in the discussion of You’ve Got Mail, the shaman may have prayed to the cave painting. Some theorists argue that cave paintings reached such a point of sophistication that, when the shaman lit the cave’s interior with a small fire, the dancing of the firelight upon the images created the illusion that the painted people and animals were themselves moving.

Even in the case of the first cave paintings, the innovation has identifiable traits that are well-established and those that are novel. The already-familiar traits are the already-existing markings and rock formations that, through pareidolia, immediately called to mind some humans or particular animals. The novelty was in the choice of humans to take the initiative in altering the markings and rock formations further to attain the results they wanted.

Ancient figurines and statues of women have shapes that, today, we would not usually associate with feminine nudes. The ancient statues of women have very decidedly pear-shaped torsos. And they are not life-sized. From this, we can segue into a discussion of innovation in sculpture in Hellenic and Hellenistic culture.




Dianne Durante’s History of Innovation in Western Sculpture
A good survey of innovations in sculpture in ancient Greece is that of art historian Dianne L. Durante in her book and Medium series Art History Through Innovators: Sculpture. She lets us know from the start that only a tiny percentage of the ancient Greek sculptures that once existed have survived to this day and been recovered. For that reason, it is difficult to identify with confidence who were the earliest artists to have used a particular method. Still, the general century in which a new method first appears — and some specific sculptors associated with that method — can be named.

Ms. Durante starts with the Bronze Age. Though she does not mention it in particular, this was a time when Mesopotamians erected statues of deities with the heads of men and the bodies of winged animals. Her survey proper begins with a culture that had important contact with these Mesopotamians: Bronze-Age Egypt. Starting from around 3000 BCE, one can find the earliest examples of a particular innovation. The Egyptians produced their first realistic life-sized statues of men. The hair looked very stiff, as if it were a hat instead. The arms were always at the sides, with no space separating them from the torso. The left foot was usually farther forward than the right, and the legs were usually unseparated from a larger slab that kept the figure standing.

I will say something that is not a point in Ms. Durante’s history. That the ancient Egyptians provided this and other Bronze Age innovations — such as the invention of geometry for commercial purposes — is unsurprising in consideration of how Egypt and Babylon were freer than most other human societies of their time. Archaeologists have found that the pyramids were built not by Hebrew slaves but by free Egyptians. Many of those laborers had farmed for most of the year and sought employment in constructing the pyramids during the seasons when the Nile flooded and made farming impossible. Egyptians sought this employment voluntarily and they were well-paid. The items that accompanied their burials indicate that Egyptian rulers thought well of these people and that they were of high status.

Egypt’s relative freedom also possibly led to the development of the alphabet. Canaanites voluntarily migrated to Egypt for work. Even when the pay was low compared to that of native-born workers, it was better than the work they had left behind in their place of origin. These Canaanite immigrants were, in that regard, similar to many of today’s migrant workers in the USA and other Western countries. These Canaanites found work in Egypt’s turquoise mines. They were able to observe the religious rites of the wealthy Egyptians who employed them. These Egyptians incorporated hieroglyphs into their ceremonies as their Canaanite employees looked on in admiration.

Aspiring to be like their employers, the Canaanite miners at first tried to emulate these rituals. When it comes to innovation, the visual similarity to hieroglyphs can be called the familiar part. But these Canaanites had a limitation the wealthy Egyptians did not — they were still illiterate. Hence, according to Egyptologist Orly Goldwasser, these Canaanites produced pictograms visually similar to the Egyptians’ but without the same meanings.

Over time, though, each pictogram came to be associated not with an entire word but with just a specific sound. That was the part that was novel. Unable to understand the written language of others, these Canaanites developed their own from the bottom up. These former illiterates had taught themselves to read through a most unexpected method — inventing their own form of writing. Almost every consonant known today was represented by a visual symbol that was a modified and simplified version of an Egyptian pictogram.

The Bronze Age economic collapse led to a decline in the business activity that made use of cuneiform writing. As the Canaanites brought about economic recovery through their own trading, their heavy use of their own alphabetic writing supplanted the cuneiform script that had previously dominated Eurasia. Those Canaanites’ descendants, the Phoenicians, undertook a prodigious amount of trading, including with the Greeks. From the Phoenicians’ documents of their trades with the Greeks, Greek merchants learned the alphabet. Then the Greeks added the vowels.

Returning to Ms. Durante’s history, she notes that there was not much change in the Egyptian sculptors’ conventions until 300 BCE. Durante makes it a point to observe that this means there was relatively little change over 2,700 years. And just as was the case with the use of the alphabet, in realistic life-sized statues of men we find another instance of something that began in Egypt and was developed further by the ancient Greeks.

Even when the new ideas did not arrive via Phoenician middlemen, Egyptians initially influenced the ancient Greeks in the Bronze Age and then into the Iron Age. But for the Iron-Age Greeks, innovation in sculpture occurred at a much more rapid pace. In particular, Durante focuses on the evolution of the Greek kouros — idealized statues of young male nudes. They appeared around 600 BCE, and at first they closely resembled the Egyptian statues. The torsos and hair styling were similar, and once again the left foot was farther forward than the right. But besides the kouros being nude instead of in a man’s skirt, there were some other important differences. Though the arms were still at the sides, the elbows and forearms had space between them and the torso. The figure stood without a large backboard, and the musculature was more defined.

By 450 BCE — 150 years later — Greek sculpture had already seen dramatic advancement. By this point sculptors were making use of a scientific principle that modern Italian practitioners call contrapposto. This refers to how, as someone stands, the weight being placed upon one part of the body causes the shoulders and hips to shift their position accordingly to maintain balance. Many of the statues from the fifth century BCE applying that principle are attributed to the sculptor Polycleitus.

In the earlier part of the fifth century BCE, there was another technological innovation. Sculptors did not always have to rely on stone. They had learned from Middle Easterners the technique of casting life-sized figures in bronze. The advantage was that bronze was easier to shape. From the 440s to the 430s BCE, Phidias carved statues of great detail. When the statues depicted clothed figures, he made realistic pleats in their clothing. The facial expressions of these statues, though, were always neutral.

In the 300s BCE, there was another new trend, this one exemplified by Praxiteles and Lysippus. Prior to the works of these two men, ancient Greek statues were carved with the intention of being seen mostly from the front. Even when sculptors first exercised the principle of contrapposto, the statues were posed in such a manner that all of the important details of the action in the pose could be taken in from the front view. By contrast, Praxiteles and Lysippus carved important details into the statue from every angle. An onlooker would have to walk in a circle around the piece to observe every pertinent feature. Lysippus’s details were so precise, as well, that he could carve what historians and archaeologists suspect were very accurate likenesses of real people. The most famous bust of Aristotle is attributed to him. One may note the precision of detail in the subtle wrinkles in Aristotle’s face and in the minute separation of hairs in his beard.

Around the 350s BCE, Scopas added something new: nuanced facial expression. In contrast to Phidias’s statues always having neutral faces, Scopas used the subtle shapes of the eyelids and curves of the lips to convey a variety of emotions, such as trepidation and sorrow.

By the 200s BCE, there was greater variety in the sort of figures presented. No longer was the medium confined to images of idealized men. Sculptors produced representations of elderly women and of small children. They also tried to produce accurate representations of the clothes of foreigners with whom Greek merchants came in contact.

It has become fashionable for intellectuals to proclaim that it is a myth that there was a “Dark Age” in Western Europe in the period separating the Western Roman Empire’s decline from the early years of the Renaissance. Nonetheless, there was a noticeable economic decline in that place and period. That decline is evident in the degree of sophistication in Western European sculpture throughout those centuries. The archaeological record evinces that it was not until around the 1400s CE, with artists like Donatello, that the old Greek and Roman methods were recovered.

Ms. Durante goes on in her survey to note that the Renaissance brought about a new attitude toward, and appreciation for, sculptors that was less apparent in Western Europe’s early Middle Ages. But here I will conclude my own essay’s heavy reliance on Ms. Durante’s history. There are more-recent case studies in innovation in artistic method.




Firsts in the History of Motion Pictures
The term lost media refers to artworks and other information-technology documents that are considered “lost” to posterity because no copies of them are currently known to remain. As Ms. Durante has noted, the vast majority of ancient Greek and Roman statues ever to have existed no longer do, which puts them in the category of “lost media.” Likewise, most of the earliest motion-picture prints documented to have been shown in theaters have been lost, destroyed over time by the elements. However, thanks to many journalistic accounts still available, it is easier to find contenders for possible “firsts” in filmmaking than it is with ancient sculpture.

The first motion pictures shot in the United States came not from California but from points farther east. New Jersey was home to Thomas Edison’s endeavors, and in Chicago there was another pioneer whose name is little-known today: Col. William Selig, a stage magician turned filmmaker. In his biography of Selig, subtitled The Man Who Invented Hollywood, Chapman University historian Andrew Erish makes the case that Selig’s business was likely the first in the USA to attempt several genres and storytelling devices that have since become well-established.

Edison’s movie The Great Train Robbery from 1903 is considered one of the earliest westerns. But it was shot in New York and New Jersey. Selig’s company was the first to shoot westerns on location in the actual American West with real cowboys and Native Americans. Edison’s What Happened to Mary from 1912 is thought to be the earliest serial for theaters. But each installment is self-contained, ending with resolution to that episode’s story. There was also the three-part crime drama Fantomas from France in April 1913. By contrast, Selig’s The Adventures of Kathlyn, beginning its run in December 1913, is the earliest-known American theater serial in which, except for the final one, every installment ends on a cliffhanger.

Of all motion pictures that are fictional depictions of adventures in jungles, the earliest-known one came from Denmark in 1907. That was Lion Hunting. The first fiction movie to come from an American company about such a safari arrived in 1909 with Selig’s Big Game Hunting in Africa.

Additionally, Selig’s company was behind the earliest-known horror movie produced in the USA, Dr. Jekyll and Mr. Hyde in 1908. This movie, though, consisted simply of shooting a stage production of the literary classic. The movie even begins by showing a stage’s curtains opening and, correspondingly, ends with them closing. In 1910 Edison’s company followed that with the first screen adaptation of Frankenstein. In contrast to Selig’s movie, the scenes of this Frankenstein were shot in separate locations; it is clearly not confined to a single stage.

Around the same time, France had its own stage-magician-turned-filmmaker: Georges Méliès, who developed various special effects and editing techniques that continue to be employed. As noted by YouTube filmmaker David Yeaman, Méliès pioneered such techniques as stop-edits, dissolves, and double exposures.

A stop-edit is done when the movie camera is stationary. It shoots for a little while. Then the filmmaker stops shooting. Between takes, the filmmaker changes something about the objects within the camera’s view. Then the filmmaker starts shooting again. The effect is that, when you watch the movie afterward, it looks as though the change is instantaneous. If you first shoot an object, then stop the camera, remove the object while keeping everything else the same, and then start shooting again, in the playback of your movie it will appear as though the object instantly disappeared into nowhere.

This editing technique is the basis for the stop-motion animation orchestrated by Willis O’Brien in King Kong, by Ray Harryhausen in The Seventh Voyage of Sinbad, and by Phil Tippett with Jabba the Hutt’s rancor monster in Return of the Jedi. The puppeteer shoots one frame of his puppet, moves the puppet slightly, shoots another frame in the new position, and so on. The final product onscreen is the illusion that the puppet moves on its own, albeit jerkily.
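To make the shared logic of stop-edits and stop-motion concrete, here is a minimal, purely illustrative sketch in Python. It is my own toy model, not anything from Méliès’s or O’Brien’s actual workflows; the scene descriptions and the 24-frames-per-second rate are assumptions made only for illustration. Each call to expose records one frame of the scene exactly as it currently stands, and what changes between exposures determines whether playback reads as an instant substitution or as gradual motion.

    # Hypothetical sketch: a "scene" is just a description string, and a reel is a list of exposed frames.
    reel = []

    def expose(scene_state):
        """Record one frame of the scene exactly as it currently stands."""
        reel.append(scene_state)

    # Stop-edit: shoot, stop the camera, swap the object, shoot again.
    expose("streetcar in frame")
    expose("hearse in frame")       # on playback, the streetcar seems to transform instantly

    # Stop-motion: shoot, nudge the puppet slightly, shoot again, many times over.
    for step in range(1, 25):       # one second of footage at an assumed 24 frames per second
        expose(f"puppet advanced {step} small steps")   # on playback, the puppet appears to move

    print(len(reel), "frames exposed")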

Retired East Carolina film studies professor James C. Holte says that by some anecdotal accounts, Méliès may have initially stumbled on his stop-edit technique by accident. As the story goes, Méliès had his camera rolling as he shot a scene out on a real street. A streetcar came into view exactly as the film jammed. After unjamming the film Méliès resumed his shooting. At that exact moment, a hearse was in the same area of the shot where the streetcar had been. When Méliès watched this reel later, he noticed that it seemed as though the streetcar had instantly transformed into a hearse.

In the early twentieth century, some people liked to say that beneficial “accidents” happen only to those who have good judgment on what to do about them. Many other people in Méliès’s position would have dismissed this optical effect as a defect and moved on. Méliès, by contrast, considered how this optical effect could be employed in his storytelling.

Méliès’s “dissolve” effect entails a similar method. It involves one image gradually fading out as, simultaneously, another image fades in. When everything in the shot is kept the same except for one spot where one object fades out and another fades in, the optical effect is similar to what happens with stop-edits, except the change is slower. This effect has been used in classic Universal monster pictures where the face of Lon Chaney Jr. (as Larry Talbot) changes from that of a cleanshaven man to that of a furry Wolf Man. Slow dissolves have also been used with two dissimilar images to represent a passage of time longer than what actually elapses onscreen.

Finally, Méliès pioneered the use of double exposures. This is when one image is superimposed upon another. When this causes a person to appear see-through, the technique can be used to make that person appear as a ghost.
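For readers who think of film frames as grids of pixel brightness values, here is a minimal sketch of what a dissolve and a double exposure amount to arithmetically. It is a modern digital approximation of my own, using NumPy, and not a description of Méliès’s photochemical process; the tiny 2-by-2 “frames,” the blend weights, and the variable names are all assumptions made purely for illustration.

    import numpy as np

    def dissolve(frame_a, frame_b, alpha):
        # Cross-dissolve: blend two frames; as alpha sweeps from 0.0 to 1.0 over the
        # transition, frame_a gradually fades out while frame_b fades in.
        return ((1.0 - alpha) * frame_a + alpha * frame_b).astype(np.uint8)

    def double_exposure(frame_a, frame_b, ghost_weight=0.5):
        # Double exposure: superimpose a second image at partial strength, which is why
        # the superimposed figure looks see-through, like a ghost.
        return np.clip(frame_a + ghost_weight * frame_b, 0, 255).astype(np.uint8)

    # Tiny 2x2 grayscale "frames" just to show the arithmetic.
    clean_shaven = np.full((2, 2), 200, dtype=np.uint8)
    wolf_man = np.full((2, 2), 40, dtype=np.uint8)
    print(dissolve(clean_shaven, wolf_man, alpha=0.5))   # halfway through the transformation
    print(double_exposure(clean_shaven, wolf_man))       # a faint superimposed second figure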

Innovation continued as the era of silent film gave way to movies in which discernible speech could be heard — “talkies.” And here is an uncontested “first” in the cinematic arts: 1927’s The Jazz Singer was the first feature-length talkie, with sounds and speech synchronized to correspond with the events shown in the moving image onscreen.

A decidedly innovative movie was 1933’s King Kong, and not merely because of Willis O’Brien’s then-new stop-motion animation. The movie ushered in the use of a technique called miniature screen projection, which was itself derived from another technique that was still new at the time, rear screen projection.

Kong contains several scenes of the human actors interacting directly with the stop-motion dinosaurs. For many consecutive frames, both the human actors and the dinosaur would be onscreen together and looking at one another. For audiences to find these interactions believable, the movements of both the actors and the dinosaurs would have to be timed to match. Absent the proper timing, both the human actors and the dinosaurs would seem to be thrashing about at random, unprovoked. To solve this, O’Brien employed rear screen projection and then his own miniature screen projection.

With rear screen projection, the following happens. First the filmmaker shoots a particular snippet of footage. Then the filmmaker shoots the scene on set that is to incorporate that snippet. Behind the objects or actors currently being filmed is a glass plate or transparent screen, and behind that is a movie projector. As the new scene is being shot, the projector projects onto that screen the snippet of footage that is to be incorporated into the new scene. The actors can see the projection of the snippet and are thereby able to time their movements and lines accordingly.

Even with this method as it was, there were still issues. With King Kong, it was the humans’ sides of the interactions that were shot first. For Willis O’Brien there remained a discrepancy in the timing: the human actors’ movements were filmed in real time whereas O’Brien had to shoot a single frame of his dinosaur on the miniature diorama set he built, turn off the camera, move the dinosaur slightly, shoot another frame, move the dinosaur again, etcetera. To rectify this, O’Brien 1) invented his own special miniature camera for shooting miniatures clearly and 2) employed the rear-screen projection method in a manner convenient for his specialized needs.

On O’Brien’s miniature diorama set, behind the dinosaur, was an area where he set up a stretched-out sheet of clear rubber. Onto this rubber he projected the snippet of footage of the human actors who were fighting against, or running from, the dinosaur. However, rather than have the snippet play out in real time, O’Brien had the projector project just one frame. He would then move the dinosaur’s body slightly in accordance with where the human actors were and how the dinosaur was to react to them.

With that being done, O’Brien shot his own frame of the dinosaur’s action in relation to where the humans were and what the humans were doing in the corresponding frame from the already-shot snippet. Subsequently, onto the miniature diorama’s stretched-out transparent rubber O’Brien projected the snippet’s next frame of the human actors. Then O’Brien would once again adjust his dinosaur puppet to provide a proper response to the humans’ movements. And then he would shoot that change.  O’Brien repeated this rather tedious process until he had a seamless final product in which it looked as though the human actors really could see the dinosaur that was growling at or pursuing them. 

For this development, O’Brien was awarded U.S. Utility Patent No. 1,897,673 and U.S. Utility Patent No. 2,029,500. I doubt that Rothbardian originality-deniers will like that last part.

Finally, you can see innovation and originality in art in how motion pictures have used the storytelling device of “flashbacks.” When I saw 1941’s Citizen Kane, I considered it overrated. However, I did not see it in the same historical context as those who had experienced this movie upon its initial release.

The movie begins with the death of the titular character, newspaper mogul Charles Foster Kane. The story of his life is then told in the form of “flashbacks.” When this movie first arrived in theaters, such a storytelling device was uncommon. The movie employed many cinematic techniques that, at the time, were relatively unfamiliar. And it was exactly on account of how groundbreaking Citizen Kane was that its storytelling devices came to be used in subsequent movies and thereby became commonplace.

In literary fiction, the use of flashbacks was much older. The Arabian Nights story “The Three Apples” begins with the discovery of the body of a woman who had been slain. Then her killer recounts the events that led to this incident. This was also a relatively early instance of the murder-mystery genre.

The earliest-known example of a motion picture to have a flashback in it was France’s silent film Histoire d’un Crime in 1901, another crime story. In the USA, D. W. Griffith invoked the technique in 1918’s Hearts of the World.

The earliest-known “talkie” to have flashbacks was 1931’s City Streets, starring Gary Cooper, the same lead actor as in the Fountainhead movie. Flashbacks then became more prominent and more frequent eight years later with Wuthering Heights. Much as with the novel that served as its source material, the Wuthering Heights film adaptation has the housekeeper Ellen recount what had happened in the past concerning Heathcliff. Most of the French Le Jour se Lève, or Daybreak, from that same year was dramatized as a flashback. And then, exactly forty years after a motion picture first employed the method, audiences first became well-acquainted with, and entranced by, flashbacks in Citizen Kane.

I hope that, by now, these case studies have demonstrated the following.
  1. Throughout the history of art, there have indeed been instances where someone attempted something that had no precedent anywhere in prior history. Hence, in the history of art, there has been much that has been “truly new.”
  2. Historians and scholars are able to make strong cases that a particular artwork may well be the earliest instance of a particular innovation appearing, or that the artwork is at least the earliest-known instance of that innovation. In many instances where some historical figure has been lauded for introducing something novel, those accolades were indeed well-earned.




A “Great Artist” Is a Second-Mover Who “Stole” the First-Mover’s Relatively New Innovation and Then Put His Own Spin on It?
Prior to concluding this essay, I want to address an alleged second interpretation of “Good artists copy, great artists steal.” According to this second interpretation, the saying is less about denying originality’s existence than it is about highlighting what can be called a Second-Mover Advantage in intra-industry competition.

The idea goes something like this. In the past, many people assumed that the first party to introduce a particular innovation to the marketplace — presumably the original inventor — would have an advantage over all would-be competitors vying to sell similar products. On that understanding, that original innovator would be the “first mover.” But, goes this argument, case studies in the history of business reveal that often the financial success and even the acclaim for the innovation go not to that first mover but instead to yet another party, a “second mover.” This other party sells its own versions of the product while that sort of product is still new, but it is not the original innovator.

Consumers come to associate the innovation with that second mover, possibly even assuming the second mover is the true originator. And, when the second mover puts his own spin on the innovation, historians might consider the second mover’s rendition to be superior and more praiseworthy than that of the first. Hence, it is this second mover, not the first, who is considered the “great artist.” This second mover did not necessarily steal anything literally from the first mover, but this second mover did — not always by intention — “steal” credit and accolades that ostensibly should have gone to the first mover instead. Hence, “great artists steal.” That is the conclusion of the argument.

Steve Jobs himself appears in the most famous case study. When the Apple Macintosh hit stores in 1984, for consumers it was revolutionary. It was the first time they used a mouse to maneuver a cursor on what was called a graphical user interface (GUI). As far as many of these consumers were concerned, Apple Computer must have originated this technology. Consequently, consumers hailed Jobs as a “great artist.” However, this technology was originally developed by Douglas Engelbart at the Stanford Research Institute, by engineers at Xerox’s Palo Alto Research Center (PARC), and at various universities. Apple Computer did not steal this invention literally; it paid licensing fees to Xerox. However, as Jobs tacitly allowed the public to associate this technology with Apple instead of Engelbart and Xerox PARC, Jobs “stole” their applause.

Similarly, when moviegoers in the early 1940s encountered Citizen Kane, this was often the first movie they could recall in which the story was told with the framing device of flashbacks. This, among many other attributes of Citizen Kane, made writer-director-producer-star Orson Welles a “great artist.” Yet, to this day, even many of the most seasoned cinephiles remain unaware that a motion picture had made use of flashbacks four decades prior in Histoire d’un Crime. In that respect, it might be said that “great artist” Orson Welles “stole” flashbacks from Histoire d’un Crime.

The word steal usually has a stigma to it. Yet many people who say that the Macintosh’s incorporation of the GUI exemplifies the principle of “Great artists steal” say this not with disapproval toward Steve Jobs but, perversely, with a cynical admiration. The insinuation is often “Jobs was not being the most moral person by conventional standards. But in terms of getting all the glory — the same sort of glory I aspire to — Steve Jobs won. He won to a degree much greater than Douglas Engelbart. And I have to hand it to him there.”

After all, many movie buffs aware of Histoire d’un Crime will say that Citizen Kane is still the higher-quality movie. They will say that the higher quality proves that Orson Welles remains a “great artist.” And they will add that they do think he “stole” the flashback motif from Histoire d’un Crime, meaning that, in their view, it really is the “great artists” who “steal.”

Even in these cases, I find both the cynicism and the use of the word steal misleading — even when Jobs spoke it himself. The fact is that Steve Jobs, not Engelbart or Xerox, is the party who brought this economic value to the market and, hence, to consumers. Had Jobs not taken an interest, this technology would have remained confined to the laboratory, unseen by consumers for many more years. The act of Jobs and Apple having brought this technology to market when they did — whereas Engelbart and Xerox did not — is itself an economic value for which Jobs, and not Engelbart, deserves credit. Likewise, we can acknowledge Engelbart, Xerox PARC, and the other university computer scientists for being the parties to have developed the earliest versions of this important technology. There is no inevitable conflict there.

It was in an interview for Robert X. Cringely’s PBS documentary series Triumph of the Nerds that Steve Jobs first told the public that his motto was “Good artists copy, great artists steal.” In that interview, the context was that Jobs did not want Apple Computer’s personnel to be hobbled by a Not-Invented-Here attitude. That is, Jobs did not want Apple’s engineers to dismiss other companies’ ideas simply for having come from outside of Apple. When Apple’s competitors had objectively good ideas, Jobs said, Apple’s engineers should consider those ideas and apply them where they could.

That is fine, but such incorporation of others’ ideas should apply at no more than the level of the general. Applying the best general ideas of others should not come at the expense of other people’s rightful claims of ownership over their own specific original works. And, as I hope this essay has made clear, those rightful claims of ownership are principles that are moral and reputational, not just legal. Morally, rightful respect for someone else’s intellectual creations extends beyond what can be enforced in court through patents and copyrights. Rightful ownership, as Howard Roark can attest, also consists of crediting the innovation’s originator and recognizing that originator’s moral authority in exercising control over her original presentations and designs. When Steve Jobs says that great artists such as himself do “steal,” it seems to be an inadvertent, passive admission that Douglas Engelbart has not received, reputationally, his proper due for the Macintosh’s best features.




Conclusion
That intellectual blog I cited earlier — the one that gleefully quoted Mark Twain, Alexander Graham Bell, and Henry Miller about “The Myth of Originality” — also provides a quotation from Steve Jobs that is more nuanced than “Good artists copy, great artists steal.” This time, Jobs says in a 1996 interview with Wired magazine,
Creativity is just connecting things. When you ask creative people how they did something, they feel a little guilty because they didn’t really do it, they just saw something. It seemed obvious to them after a while. That’s because they were able to connect experiences they’ve had and synthesize new things.
That quotation, much like T. S. Eliot’s original “Mature poets steal” quotation, is nothing better than a mixed bag. Jobs acknowledges that particular individuals — as opposed to society in general — are indeed bringing about “new things” (emphasis mine) and that these particular individuals are “creative.” The part about “connecting things” refers to the manner in which an innovation, per Aristotle, takes hold of some familiar ideas and conventions and then “connect[s]” them to a new context in which they are unfamiliar.

But then Jobs says these people are “just connecting things,” the word just insinuating that this is not the big deal it is made out to be. Lest anyone dispute that he meant to downplay the grandeur of it, Jobs continues that these innovators “didn’t really do it.” That anticipates President Obama declaring to entrepreneurs proud of their own economic value-creation that “You didn’t build that.” This is related to the cliché with which this essay started. If the innovator “didn’t really do it,” then the innovator, despite being a “great artist,” cannot properly take full ownership over her innovation. In that respect, the thinking goes, the innovator’s own innovation is something the innovator “stole.”

Innovation indeed involves connecting familiar ideas with unfamiliar contexts, but innovation is not “just connecting things.” It is a lot more than that. As Ayn Rand observed, the innovator did do something new, and over that novelty the innovator has the rightful primary claim of full ownership.

I said it on Twitter years ago, and I say it again: “Good artists copy, great artists steal” is an insipid cliché that needs to go out of circulation. It should go extinct in Silicon Valley and everywhere else.

It is only by the vaguest definitions and the loosest criteria that someone can say that throughout history no one has truly exercised originality in artistic media. It is merely by defining artworks at the most general level, and by refusing to look at the concepts any more closely, that someone can say that everything in art is a copy of other artwork. But as with the specific designs, delineations, and presentations protected by copyrights and patents, the fact of the matter is that originality in art is defined by specifics. The Arabian Nights story of “The Three Apples” made use of the flashback motif centuries prior to the silent movie Histoire d’un Crime doing it. But Histoire d’un Crime being the earliest-known motion picture to have done it means that Histoire d’un Crime remains a cinematic “first” and should be remembered as such.

That principle remains in effect even if one judges that Citizen Kane executed the flashback motif in a manner more artful, cohesive, and aesthetically pleasing than did Histoire d’un Crime. Remembering Histoire d’un Crime for one sort of accomplishment does not take away from Citizen Kane, and vice versa.

To look at art history and to use logic is to discredit the cliché that in art “Nothing is truly new.” There was a time when there were no motion pictures, let alone motion pictures making use of flashbacks in their storytelling. Now there are many motion pictures that do that. It stands to reason that some party was the first to do it and, in so doing, thought specific thoughts and took specific actions without exact historical precedent. At every increment of creative change and progress, there is something truly new.

Yes, talented artists and innovators do learn from the techniques of others and they do apply those lessons, on a general level, to their own work. By the same token, the history of art and technology is still the history of many then-unprecedented “firsts.”

And through proper documentation, historians are often able to find evidence by which they can discern which party was likely the first to do what. By that method, even specific artists who are not as well-remembered as Orson Welles can get their due. These are great artists.

Great artists don’t steal. Great artists originate. Period.