Copying is the way design works

karaterobot
58 replies
22h17m

As a designer, I feel the need to be original. If you’re a designer, or even if you’re just interested in design, you probably feel the need to be original, too.

I've been a professional designer since 2006, and I got over that thinking pretty quickly. A designer trying to be strikingly original is rarely acting in service of the design. If you want to be strikingly original, you probably want to be an artist instead of a designer. What a designer fundamentally does is communicate the best solution to a problem, given the requirements, goals, and constraints of that problem. Originality is subordinate to that at best.

burningChrome
28 replies
21h35m

This.

I was a UI/UX guy for about 5 years and worked for a company that pumped out thousands of sites a year. A bunch of their designs won awards; I saw their model and thought I could do that too. It seemed easy.

The hitch was that I was going to design really cool sites, with all kinds of animations, huge text, really cool navigation menus, etc. In short, I had a very romantic idea that I would dictate some incredible design to my clients. I thought I was the Frank Lloyd Wright of design: whatever I showed people, they would swoon and then go with whatever uber cool thing I showed them.

Reality set in with my first client. It was the same thing: they didn't want cool shit, they just wanted their potential clients to find information about their work and contact them to hire them. After another 4-5 clients, I suddenly realized that web designers aren't artists creating ultra cool, ultra rare stuff that clients must absolutely have like a Banksy piece; clients have more fundamental problems they're trying to solve, and they want you to solve those problems for them.

I got my ego checked in a hurry, but it was a good lesson to learn. You're not selling art, you're selling a solution to their problems.

ozim
9 replies
20h28m

It is not only that. For example, wannabe EDM DJs think they have to be creative and find tracks no one has ever heard, to be edgy or whatever… but most people pay to hear cookie-cutter songs played so they can dance and have a good experience; they don't want to be surprised at an EDM event. There are big names who can do whatever they want, of course, but that's a different expectation.

The same goes for software devs who think it must be framework-like code, extensible and reusable, that will still be there in 20 years. Well, no: if it's a CRUD app, it will most likely be trashed in 2 years. Stop overthinking and just do it :)

caseyohara
6 replies
13h9m

Wow, this couldn't be further from the truth. It might be true for DJs playing "main stage" style EDM (poppy mainstream music) but for most electronic subgenres – especially techno – the crowd absolutely expects the DJ to be a superb crate digger and pull out new and deep tracks they've never heard before.

No one goes to a techno club to hear rinsed tracks; they want the DJ to show them music they've never heard before. Before the digital age, people would go out to see touring DJs specifically for their collection of rare records that no one else had and you couldn't hear anywhere else. This is still true today in the more underground scenes. It's the opposite of cookie cutter.

jrflowers
4 replies
10h12m

Most DJs do not make their livings at techno clubs. The majority are hired to play bars and events that do not cater to particularly discerning audiences.

phpnode
3 replies
5h16m

Exactly. The DJs that are innovative, crate digging, slightly pretentious music nerds are almost exclusively hobbyists, with a vanishingly tiny percentage of them able to eke out a meagre living from it. The majority of full-time DJs cater to mainstream audiences who absolutely do want to hear the same 50-100 tracks on rotation every time they go out. They want to dance and sing along to music they're familiar with, and if the DJ doesn't play what they know then they won't dance, they won't stay, and the DJ won't remain employed.

It's actually very similar to web design - innovation has its place, but 99% of the time people want familiarity.

Source: spent a decade as a professional DJ

bluGill
1 reply
4h34m

A typical DJ is allowed one weird song nobody has heard before. If it is a long show, maybe one per hour. The rest better be songs the majority of people know and can sing along with.

phpnode
0 replies
2h56m

Yes, ideally played when the dance floor has been full for a while to encourage people to go to the bar and buy another drink.

runelk
0 replies
59m

You might have a bit of confirmation bias based on the particular environments you've been in.

Having taken part in various types of electronic music/art scenes since the early 2000s, I've met all kinds of people: local hobbyist bedroom producers playing for free, semi-professional artists juggling gigs and touring with one or more side jobs, full-time DJs playing everything from small underground parties to some of the biggest parties/festivals at the time. They all cater to their audience to varying degrees, mainstream or not.

Granted, the scenes I've bumped into tend to be on the non-mainstream side. That's where you can actually go professional as that "innovative, crate digging music nerd" you refer to (I removed "slightly pretentious" because that hasn't been my experience). It's tough, but it can be done, and it's a larger group of people than you seem to think.

I've also met some professional DJs that fully cater to the audience in the way you describe. Many of them make statements like yours, e.g. "99% of the time", "almost exclusively hobbyists", "slightly pretentious", etc. I really don't get why, because it's just not true, and to be honest it comes across as a bit defensive or passive-aggressive.

I mean, of course there is the mainstream audience of the type you describe. But even that audience changes its opinion about which 50-100 tracks it expects you to play on a regular basis. That change has to come from somewhere, otherwise they'd still demand disco tracks from the 70s. That somewhere is the stuff that hasn't gone mainstream yet, and while the percentage of people who can make a living off it is probably not very high, it's a lot higher than what you claim it to be.

That vast overlap between underground/alternative scenes and the mainstream is super interesting, and I'm pretty sure that if you included that part in your statistics, you'd see a different picture.

NB: I might have a bit of confirmation bias based on the particular environments I've been in ;)

carlmr
0 replies
11h49m

It's like the perfect counterexample: a good DJ needs to have really good taste, constantly listen to new tracks, and think about where they can be used.

Everyone thinks designers are more creative than they are. Most people think DJs are less creative than they need to be.

maeil
0 replies
5h14m

I'm not sure how to phrase this in a way that complies with the spirit of HN (open to suggestions!) but that's a pretty American take on electronic music.

It's not necessarily wrong, but it holds just as much for any other genre of music, and the choice of "EDM" to make the point is pretty typical.

britzkopf
0 replies
1h43m

Whoa there. I've not done the polling, but I suspect most EDM consumers think of themselves as music lovers the same way people who like jazz etc. do. I tend to agree with your assessment that the vast majority of it is not in the same category as "real" music, but I don't think attendees of a rave would go along with that.

DigiEggz
9 replies
15h52m

Do you have any examples of some "cool" sites that you designed, even prototypes? You piqued my interest.

burningChrome
8 replies
13h45m

When parallax scrolling was cool and different, I designed an architecture site for a local architect using the effect.

It was very similar to this site where you had jarring transitions, background changes and images moving at different speeds. https://doubble.group/sg/

The end result was very similar to the site above, and we got a lot of positive feedback when their current clients saw it because they were blown away. While I was busy separating my shoulder patting myself on the back, we realized a few months in that the engagement was horrendous. The leads from their contact form dried up to almost nothing. Analytics showed an insane drop-off from the home page. None of the internal pages were getting any traffic. We quickly realized that nobody could find any content on the site: they couldn't get to the contact page easily, and the content was hard to find and/or read because the motion and animation constantly took your focus off of what you, as a user, were trying to do.
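
(For anyone curious about the mechanics, "images moving at different speeds" is a small trick: each layer gets translated by a different fraction of the scroll offset. A minimal TypeScript sketch, assuming hypothetical data-depth markup rather than anything from the actual site:)

    // Move each tagged layer at a fraction of the scroll speed, e.g.
    // <div data-depth="0.3">. Depth 0 scrolls normally; higher values
    // lag behind the foreground, producing the parallax illusion.
    const layers = document.querySelectorAll<HTMLElement>("[data-depth]");
    window.addEventListener("scroll", () => {
      const y = window.scrollY;
      layers.forEach((el) => {
        const depth = Number(el.dataset.depth ?? "0");
        el.style.transform = `translateY(${y * depth}px)`;
      });
    });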

We had it up for four months before having to pull it and put their old site back up, then redesign another simple, more refined site that would work better for their users. It was a great lesson about solving problems instead of creating something cool that nobody could use.

We also designed a site for a local event to support a women's shelter and used parallax again to tell a story of how women are shuffled through a system that does little to protect them from their ex-husbands or violent abusers.

It used the same techniques as this site, with both horizontal and vertical scrolling to show a timeline and story with illustrations and infographics. https://collagestudio.ca/en

This also got a ton of good feedback, and a few other non-profits approached us to do something similar for them. We did a few more using the same template, switching some elements to make it original for each client. This worked out much better: when the goal is for people to digest the story and the points you're making, this kind of design has real impact, unlike when new clients are trying to find specific content and the contact page.

Hope that helps!

carlmr
7 replies
11h45m

When parallax scrolling was cool and different, I designed an architecture site for a local architect using the effect.

A friend of mine designed such a site with his web design company. I instantly thought: amazing, but horrendous.

On that note, why is Apple still successful with this? Everything is moving on their website.

lancesells
1 reply
3h53m

They use their product landing pages as a commercial, but you can see they keep the top nav easy to use for getting to the buy flow or the tech specs.

Personally, I dislike scrolljacking but the other animated elements that come up in the viewport are pretty well-done. It's all ultra-sanitized and corporate but there's a lot of effort and finesse put into it.

Phone comparison landing pages:

https://www.apple.com/iphone-15-pro/

https://www.samsung.com/us/smartphones/galaxy-z-flip6/

https://store.google.com/category/phones?pli=1&hl=en-US

pimlottc
0 replies
3h40m

The iPhone 15 Pro page is a good example, as soon as you click “Buy” or any other link on the top nav, the pages become much more conventional while still retaining a consistent and functional style. The dynamic scroll hacking stuff is only on the main marketing pages.

whywhywhywhy
0 replies
7h4m

On that note, why is Apple still successful with this? Everything is moving on their website.

Think of it more like a scroll-controlled trailer. The metric they might be optimizing for is time on site: the longer a customer spends there, the more likely they are to convert.

The reason I think it's this is that their sites are mega long and information-packed these days, with everything moving.

melagonster
0 replies
8h56m

I don't know anyone who bought an iPhone from their website, so this is not so important.

krisoft
0 replies
7h48m

On that note, why is Apple still successful with this?

It is very possible that they are successful despite their web design choices.

ben_w
0 replies
8h52m

On that note, why is Apple still successful with this? Everything is moving on their website.

IMO that's Apple being a high-fashion trend-setter rather than good UI/UX design.

The current choices of videos they auto-play actually give me motion sickness, which I don't normally get from video content.

DrScientist
0 replies
9h13m

On that note, why is Apple still successful with this? Everything is moving on their website.

Everything apart from the navigation bar at the top, which is well organised and static (over time).

I.e. in terms of the functional "find stuff" part of the site, it's all there in the top few pixels (1) and the submenus. The rest is entertainment.

(1) There is also a footer at the bottom of the scroll - with a whole host of simple links - if you get that far and haven't found what you are looking for.

chefandy
6 replies
13h38m

The hitch was that I was going to design really cool sites, with all kinds of animations, huge text, really cool navigation menus, etc. In short, I had a very romantic idea that I would dictate some incredible design to my clients. I thought I was the Frank Lloyd Wright of design: whatever I showed people, they would swoon and then go with whatever uber cool thing I showed them.

hmmm... That approach is anathema to every UI designer or UX person I've encountered in that field. The core of UI design is 100% about clarity-- letting the user focus on exactly what they need to solve their problem. The core guiding principle of UX work is designing based on empirical research and then iterating based on user testing... even if it doesn't work out like that in practice, it's still laser-focused on helping the user achieve what they need.

Did you transfer into the field from a non-web-design background? The people I've seen approach web design with the intent of making some sexy website that's flashy for its own sake were a) front-end developers that thought the technical know-how was the hard part, b) branding and identity designers, or maybe print designers that never had to consider designs that people actually had to do stuff with, and c) small-org IT people that were sick of IT and were charged with maintaining the organization's website so they figured it would be an easy switch.

bluGill
3 replies
4h29m

UI and UX designers had their heyday in the 1990s. Every UI I see today shows that UX designers were not invited to have input.

chefandy
1 reply
54m

That's weird, because the dozens of UI designers, UX designers, and researchers I know (UI design and UX design are not the same thing) are employed doing exactly what they were trained to do. If you think UX was at an apex in the 90s, you haven't actually looked.

bluGill
0 replies
5m

There are more today than in the 90s, for sure. However, there are a lot more UIs around, and the big players don't give the UI and UX design people nearly as much control as they did then, so bad UI dominates. Today's flat UI fad would not have been allowed in the 90s.

Suppafly
0 replies
2h32m

UI and UX designers had their heyday in the 1990s.

But also back then, anyone could and did call themselves a UI/UX developer because it was trendy to do so and paid well. Most weren't actually good at it.

The5thElephant
1 reply
2h30m

So many comments here are just anecdotal experiences pretending to be absolute statements.

Web design used to be filled with ridiculously detailed and "over"-designed websites that were rarely hyper-focused on clarity or efficiency of communication. It's only in recent years that this has become such a singular focus, which in turn has created a sentiment that UI and web/app experiences have lost their charm.

Many of the currently popular marketing site designers in the design community do come from UI/UX and web-design backgrounds, and they are popular because they design over-the-top big-text animation-filled websites that catch your eye.

The core of UI design is not "clarity". That is one quality you can aim for, and you will find a wide range of opinions on what it means and how to measure whether you were successful. But "user interface/experience" does not imply it HAS to be an efficient one. Some UI/UX is designed for delight and delight alone.

The person you are replying to got into the industry with the same attitude most UI/UX designers I know had starting out. The people who approach it with your attitude have mostly been engineers. In the end most meet somewhere in a happy middle.

chefandy
0 replies
39m

So many comments here are just anecdotal experiences pretending to be absolute statements.

Well I've got a pretty recent design degree and have a lot of exposure to what people are thinking and how people are practicing in this field. If you've got some empirical evidence that challenges that, I'm happy to consider it.

Web design used to be filled with ridiculously detailed and "over"-designed websites that were rarely hyper-focused on clarity or efficiency of communication.

Yes, I've been in the field for decades. For most of the internet's history, web design was done by "web people" and not designers. Additionally, lots of it has been done by visual designers and not interaction designers-- that yields very different results.

It's only in recent years that this has become such a singular focus, which in turn has created a sentiment that UI and web/app experiences have lost their charm.

So where's your non-anecdotal support for this absolute statement?

Many of the currently popular marketing site designers in the design community do come from UI/UX and web-design backgrounds, and they are popular because they design over-the-top big-text animation-filled websites that catch your eye.

Sorry, no. Most people who put marketing sites together come from advertising, which is almost exclusively filled with visual designers. There's nearly no reason for a marketing website to employ the services of either a UI designer or a UX designer. There are a lot of people-- as you can see in this comment section-- that call themselves UX designers that don't even realize how wrong they are. Just like there are lots of people who cargo-cult PHP snippets from tutorials that call themselves software developers, or even software engineers. Again, if you have any non-anecdotal evidence that says otherwise, I'm happy to look at it.

The core of UI design is not "clarity". That is one quality you can aim for, and you will find a wide range of opinions on what it means and how to measure whether you were successful. But "user interface/experience" does not imply it HAS to be an efficient one. Some UI/UX is designed for delight and delight alone.

The fact that you say UI/UX is telling. While a UX designer may concern themselves with UI design, they are not even close to the same field. UX is about product design, overall. UI design is a communication discipline in the vein of HCI in which the goal is to communicate the functionality of a program to a user. While there are lots of colloquial misuses of these terms in companies that don't really focus on these things, any organization that has codified design practices and structured design roles that actually needs to define what these people actually do all day uses them correctly.

The person you are replying to got into the industry with the same attitude most UI/UX designers I know had starting out. The people who approach it with your attitude have mostly been engineers. In the end most meet somewhere in a happy middle.

I'm an art school trained designer having switched careers from web development. Most engineer types I've encountered call anyone that touches the front-end without coding a UI/UX designer, and think the purpose of design is aesthetic. I've had dozens of discussions on HN, specifically, with developers that think exactly that. Within the big UX organizations I've worked with and fellow UI designers, what I've said is the rule rather than the exception. Go and look at UX portfolios for people with professional experience in the field-- they're full of case studies, not visual design, and CERTAINLY not flashy visual design.

f1shy
0 replies
9h12m

In UI you want to be anything but original. It should be as “the same” as possible.

m12k
9 replies
20h35m

Also, one of the most important UX principles is for things to work the way the user expects. And unless you are the market leader, those expectations are mostly built based on all the other designs that your users interact with, rather than yours. So to the extent that originality means diverging from those expectations that are built elsewhere, it is actively doing your users a disservice, by not letting them leverage the expectations and muscle memory they already have. Building on paradigms that others have established as the norm means meeting users where they are.

robertlagrant
5 replies
20h1m

Right. "Intuitive" mostly means "I have seen this elsewhere."

rachofsunshine
3 replies
19h37m

As a concrete example, the idea of a mouse was once counterintuitive to users because they'd never seen one before.

Windows included Solitaire with the OS in part to introduce ideas like "click" or "click and drag" to users that were unfamiliar with GUIs, by linking them to physical concepts users did understand ("oh, I have a physical card, I can grab it and move it around, that makes sense!").

moomoo11
1 reply
16h16m

Wow that’s cool. I remember my dad was addicted to solitaire lol

carlmr
0 replies
10h51m

He was playing the tutorial all along.

robinsonb5
0 replies
5h35m

was once...

...and is rapidly becoming so again, hence modern UIs treating it as a second-class citizen.

hnick
0 replies
10h59m

Applies to other things like music too. Sometimes people are ahead of their time and most people can't digest it.

xinayder
2 replies
9h44m

While it makes sense, it really isn't the case if the market leaders have shitty design principles.

Apple stopped bundling an iPhone charger on recent models. Samsung did the same, but realized the backlash was enormous, and offered the charger for free (instead of being an additional purchase) if you bought a recent model.

Same with the headphone jack, although it was received much more negatively, and I'm pretty sure Samsung didn't give a damn about most of its users complaining they now had to buy new headphones to listen to music on their devices (they mitigated this a bit by offering USB-C headphones with their flagship devices for a while).

It's an outdated line of thought that your designs need to feel familiar to the user even when the competitors have dark or annoying design patterns rather than convenient ones. The average user is no longer a tech illiterate person. We should stop assuming that common things like opting out of marketing/AI data training should be left to advanced users only, and make them available to everyone, with ease.

krisoft
1 reply
7h12m

I don't think you are quite talking about the same kind of design here.

You are talking about very high-level choices. (Do we bundle a charger with every new phone? Does the phone have a jack?) Those are not really good examples of where familiarity is important.

The argument about the importance of familiarity is about the UX paradigm of the phone. Think about the task of connecting a phone to a wifi network. You usually do that by unlocking the phone and finding the settings (most likely under an icon resembling a gear or a spanner, even though neither of those has anything to do with setting up wifi). Inside the settings you have a long list of things you can set; you move between them by dragging the screen up and down. You find the menu item for wifi (it probably has a wifi logo or a radio-waves icon) and tap it. Then you see something where you can turn the wifi radio on or off, along with a list of SSIDs you can join. You tap the one you want to join, and it asks you for the password associated with the network. Usually you can tap the password field and an on-screen keyboard appears where you can type the password in.

This is by and large the way to connect to wifi on any modern smartphone. This is the "familiar".

To better illustrate what "lack of familiarity" would mean, imagine a phone where, instead of finding the wifi settings in a "settings" menu, you connect to a new wifi in the maps app. Why? Wifi networks are location dependent, so why not? These designers decided that wifi networks appear as small colourful dots on the map. Then imagine that after tapping your selection from the list of SSIDs you need to push a button on the side of the phone to "accept" it; otherwise it won't connect. Then imagine that instead of showing you an on-screen keyboard to type in the password, you need to morse-code tap the password in on the back of the phone. The phone would indicate this to you by showing an icon of a drum kit.

This is what "lack of familiarity" would look like. Clearly this imaginary phone would be very hard to use, and the users would reward the manufacturer's creative thinking with a lot of returns and complaints.

robinsonb5
0 replies
5h37m

Another example of this: when I bought my current phone, it took me well over a week to figure out how to put it on silent, because the option is no longer on the control panel you swipe down from the top of the screen.

No, now I have to actually adjust the volume to make the volume indicator / slider pop up, and then the mute button is visible.

If the volume slider were accessible through the regular on-screen interface, I might have looked for a mute button alongside it. But pressing the volume up/down buttons didn't occur to me, because those buttons are for nudging the volume one step in either direction, not for making hidden UI elements appear.

guappa
3 replies
11h30m

There's a full time design team where I work. The menu items got decided before they all got hired.

What they do is move the menu items around every few months, change the colours, and design our application with a mobile layout, despite 99% of our users being on desktop computers…

nox101
1 reply
11h5m

they need to be punched in the face!

Designers who change up the UI just to have something to do drive me nuts! And now it's life-threatening, because cars get updates every ~12 months in which some designer has decided to change how you use the car. So things you got used to suddenly change, and you have to figure them out WHILE YOU'RE DRIVING!

I hope someone manages to sue over these changes when someone inevitably dies, so there'll be some pressure not to make them.

carlmr
0 replies
10h48m

I still think cars need to be controlled through hardware knobs. At least all the normal car functions.

GPS and AndroidAuto/Apple CarPlay interaction should be the one exception, but even here you need a volume knob at least, so that if the volume is suddenly too high you can react with muscle memory.

Because surprisingly high volume can distract you enough to crash.

ErigmolCt
0 replies
6h48m

Missing the mark in terms of aligning with user needs

strogonoff
1 reply
9h58m

The interesting thing about design is that once you combine the good parts of preexisting approaches (not least because those are patterns already familiar to users), relevant first principles (visual hierarchy, legibility), forward thinking (sustainable architecture with flexibility in the right places), and the context of your specific circumstances and goals, you will most likely end up with something sufficiently original without making any extra effort in that direction. And rather than being an artist's whim, it would be true beauty arising from function.

ErigmolCt
0 replies
6h51m

I think true beauty in design arises from its function. It's a beauty that serves a purpose, solves problems, and meets needs.

raincole
1 reply
12h21m

If you want to be strikingly original, you probably want to be an artist instead of a designer.

Copying is the way art works as well (at least for those who are not doing super-edgy-fine-art).

Typical journey of a digital painter:

1. Refuse to copy. Refuse to even look at references.

2. Hoard references. Over-reference.

3. Copy in the right way.

tetha
0 replies
11h36m

Music isn't very different either. A common recommendation is to first learn to play songs you like, then start diverging a bit: adjust the things you don't like and merge in the ideas you do like later on.

hyperbolablabla
1 reply
7h19m

At a company I used to work at, the head of design told me that artists work to establish their vision, and designers work to establish the audience's vision - something like that. Made a lot of sense to me!

ErigmolCt
0 replies
6h50m

It draws a profound distinction between the roles of artists and designers.

bingemaker
1 reply
7h53m

In my experience, designers mistake web design for print design. Too often, the focus is on the UI rather than on the UX.

ErigmolCt
0 replies
6h46m

A common challenge in the design industry

psychoslave
0 replies
14h20m

Even an artist has to make something that resonates with their public to be called one; otherwise the person might be creative but lack the social dimension, which is a preponderant trait of any artistic practice. Much like the difference between creating a language of your own for exclusive use in your diary and creating a language enshrined in a work of literature, as Tolkien did.

markk
0 replies
5h29m

Designers should feel the "need to be original", in the sense that every project is different, and can be looked at with fresh eyes.

Perhaps a project is 50% similar to existing project A, 45% similar to existing project B, and 5% novel. Finding this correct balance of copies of A and B, and finding a good solution to the novel part - this process feels "original" in many ways.

joveian
0 replies
6h40m

My sense (and I think this matches what you are saying) is that the best design is to make habits smoother and doing that often involves difficult engineering. We like to think about reasons and meaning and purpose and such but humans are primarily a collection of habits with rare changes in intent and a lot of correcting for stuff that doesn't go smoothly. The best design often becomes almost invisible because it just works, but it takes a lot of design effort and engineering for that to be possible. If you write down the differences between a great design and an ok design they can often sound entirely trivial but aren't if you think from the perspective of habits.

I recently found this excellently designed grain bin from Masuda Kiribako:

https://kirihaco.shop-pro.jp/?pid=181616902

It looks nice but is fairly simple; if you haven't spent a bunch of time looking at available alternatives it might not look like anything special. Keeping grain away from insects and humidity and oxygen (and sometimes rodents, though I'm not sure how well this one would do in that case) while still being able to access it easily is not trivial. Plastic buckets work well and are cheap but don't look as nice and most lids are annoying (I suspect the lid on this one might possibly be a bit annoying as well but likely not as bad). Glass jars are nice to use but fragile and best for smaller amounts. Wood is particularly challenging due to the dimensional instability and they use a particular type of wood with something like eight years of preparation to make durable boxes. (I suspect the magnet on the scoop is pure marketing though, you can't even use it when refilling if you hook the lid on the edge which is the one time it would be really handy).

I think low latency is one of the things that makes software and websites feel really nice to use and is often overlooked.

gyomu
0 replies
19h1m

Design is about navigating ambiguity, and finding which fine lines to walk when resolving tensions inherent to opposing constraints in any sufficiently complex problem space. There is rarely a single best solution to such problems.

Originality certainly has a role to play in there - many (most?) iconic products were strikingly original. Would the iPod have been a better designed product with a D-pad (or other standard button arrangement) over its scrollwheel? Or the Wii with a standard gamepad?

Originality and novelty (particularly when it comes to visual aesthetics) are forces people respond to, and great designers know how to channel those forces in constructive ways for their work.

gchamonlive
0 replies
16h48m

Maybe even if you copy and reuse virtually everything, the composition can still be original, provided you are not just applying the latest trends blindly because they're shiny and new. I come from IT architecture, and for me this is the best transposition of originality onto the concept of "there is no silver bullet". Do you think this also counts as originality?

atoav
0 replies
12h3m

As a former professional designer (and current improvisational musician) myself, I would even come at it from the other direction: there is no true originality to begin with. Everything you do borrows from somewhere else, except maybe the things you do by accident.

But there is still a difference between a designer who blatantly slaps an existing aesthetic onto your project and a designer who tries to come up with a suitable look from first principles.

Design isn't styling; it is the visual organization of information, with styling. So unless your information is the same, the outcome will differ anyway.

ErigmolCt
0 replies
6h53m

Your distinction between designers and artists is particularly compelling. While artists have the freedom to prioritize personal expression and originality, designers usually balance creativity with functionality. Agree.

karmakaze
41 replies
23h14m

This is a great quote:

In the middle of Apple’s case against Microsoft, Xerox sued Apple, hoping to establish its rights as the inventor of the desktop interface. The court threw out this case, too, and questioned why Xerox took so long to raise the issue. Bill Gates later reflected on these cases: “we both had this rich neighbor named Xerox ... I broke into his house to steal the TV set and found out that [Jobs] had already stolen it.”
turnsout
28 replies
20h53m

This is such a frustrating misunderstanding of the history, and the history is fascinating. Xerox invited Apple to tour PARC in exchange for $1M worth of pre-IPO Apple stock, which today would be worth [checks notes] more than that. There was no theft.

Apple engineers got to see the Alto, not the Star (the screenshot in the article is wrong, the chronology is wrong). The visit was so fast that Apple engineers thought they saw realtime overlapping windows when they didn’t. [0] So it’s possible Xerox was inspired by Apple with the Star, not the other way around.

Meanwhile, Bill Gates totally outs himself as someone who would steal shamelessly.

[0]: https://folklore.org/On_Xerox%2C_Apple_and_Progress.html

iczero
17 replies
20h43m

There was no theft.

I didn't know that touring somewhere meant you could copy all their designs. Was that explicitly stated?

cortesoft
11 replies
19h0m

Why would they give away $1M in stock just to look?

lolinder
8 replies
17h41m

OP misrepresented what happened (intentionally or otherwise). They didn't give away $1M in stock, they granted Xerox the right to buy $1M in stock.

tptacek
6 replies
16h18m

Same difference. At some point you're just arguing about what the number was. I don't care about the broader argument, but the right to buy $1MM of private shares is, in fact, real consideration.

lolinder
5 replies
16h4m

Is it real consideration that anyone could consider to be worth "all the IP that you happen to be able to see while on this tour", or is it real consideration that Xerox thought was worth "an opportunity to see how we run the best tech lab of our generation"?

Don't forget that Apple presumably got paid $1M out of the deal in addition to giving the tour. I'm having a hard time seeing the argument that the right to pay Apple for some of their shares in 1979 was perceived as being worth any of Xerox PARC's IP, much less "as much as you can carry in your head".

(None of which is to say that Apple was wrong to copy what they could, morally or legally. I just find the argument that these shares are evidence that it was an above-board trade that Xerox was on board with to be very weird.)

tptacek
3 replies
15h55m

I don't care; as far as I'm concerned, that argument is isomorphic to "would $1MM literal dollars be enough for what they saw, or should the number be higher". Maybe. But the right to buy shares is not really a meaningful distinction to the simple issuance of shares in this historical context. That's all I'm arguing.

rsanek
1 reply
13h38m

in retrospect? sure. at the time? radically different.

just ask your average sv startup employee if they think options and RSUs are the same

tptacek
0 replies
13h22m

Options and RSUs are both universally acknowledged as consideration. That's all I'm arguing. Should it have been $1MM in stock, or $50MM, or $200k? Hell if I know.

viridian
0 replies
5h28m

What? It absolutely is.

One has a face value of $1,000,000.00

The other has a face value of $0.00

If Apple offered me one of these right now it would completely change my life, and the other would be something I wouldn't even take them up on.

turnsout
0 replies
3h59m

It was 100% an above-board trade. Xerox at that point had poured an ocean of money into PARC and hadn't really seen any return on it. I don't think they would have seen it as an IP transfer, because it wasn't. Apple didn't rush out to implement Smalltalk. Instead they were inspired by the principles, misunderstood some stuff, and came up with something 100% better for the average consumer.

Xerox wasn't stupid—they were trying to get some value out of this research lab that was, on paper, lighting money on fire.

turnsout
0 replies
4h4m

You're right of course—another way to put it is that Xerox got the chance to make a $1M investment in pre-IPO Apple.

lupire
0 replies
18h45m

They didn't. They got paid $1M!

iczero
0 replies
18h50m

Was it only for the tour? Was it actually $1M, or did it later increase in value? Did Xerox value a strategic relationship? "A whole load of ideas" seems worth more than $1M to me, especially if you're Xerox back then.

kragen
3 replies
16h12m

in general you can copy all of someone's designs even without touring. exceptions are when they're covered by copyright or patent

immibis
2 replies
15h20m

Which they are by default.

rsanek
0 replies
13h40m

not by patent

kragen
0 replies
5h48m

no. copyright has a rather limited scope that excludes the aspects of design you most want to copy; patents must be applied for and received; and both expire

turnsout
0 replies
4h6m

If you read a bit about it, you'll understand that Apple did not copy the Alto—they did something that was actually way harder. They created a better version of what the Alto was attempting to do, and got it to run on lower-grade hardware they could sell for 1/10th the cost.

But yes, Xerox knew exactly what they were doing when they invited Steve and his team in.

lolinder
2 replies
17h44m

Your version makes it sound like Apple gave Xerox $1M in stock in exchange for the visit. Most sources I can find don't mention the stock at all, but the one that does [0] makes it pretty clear that the offer was to let Xerox buy stock from Apple pre-IPO in exchange for the tour, which is a very different story:

Jobs's company stood on the precipice of a public offering guaranteed to make him and any investors wealthy, and the tech guru's impending good fortune enticed the suits at Xerox to make him an offer he couldn't refuse: Let us buy shares in your company, and we'll give you a peek inside the greatest minds in your field.

[0] https://www.newsweek.com/silicon-valley-apple-steve-jobs-xer...

turnsout
0 replies
3h58m

You're right—bad phrasing on my part

gen220
0 replies
1h14m

There was no explicit trade of stock for tour. The investment by Xerox in Apple happened before such a tour, for entirely separate reasons. There was another Apple exec who was managing pre-IPO external investor interest. There isn't a recorded reason for why Apple chose Xerox (among other investors), if I remember correctly.

However, Steve did leverage the fact that Xerox was an investor to bully the on-site engineers into providing him with the "executive" demo, after receiving the run-of-the-mill public demo and learning that a higher-tier demo existed. It involved a phone call to east coast Xerox corporate, who instructed the on-site engineers to provide the full demo. The PARC lab director was out of office that day, and later said in an interview that he would have stonewalled Jobs's request.

Another fun fact: Xerox sold (the majority of?) their stake in Apple almost immediately for a quick turnaround profit post-IPO. Obviously there's no recorded reason for that trade, but my impression is that they didn't think Apple had what it takes to build a proper OS (they envied Apple's ability to cheaply assemble hardware, but viewed their own software suite/R&D as a big moat).

Edit: also, Apple engineers were already working on replicating PARC tech at the time of the (not-so-) "fateful" demo. It was behind schedule and not demoable, and Steve was getting frustrated with these facts. They encouraged Steve to visit PARC to get a preview/demo of what they were building, to ground him in the value of their work. There had already been a handful of ex-PARC people at Apple for a while at this point, and PARC's work stems from "The Mother of All Demos" given decades ago. The Alto was unique in its early implementation, but not in its ideas.

Hitton
2 replies
20h46m

Meanwhile Steve Jobs: "We have always been shameless about stealing great ideas."

albumen
0 replies
9h48m

top comment: "The credit should be given to poet T S Eliot (1920): “Immature poets imitate; mature poets steal; bad poets deface what they take, and good poets make it into something better, or at least something different. The good poet welds his theft into a whole of feeling which is unique, utterly different than that from which it is torn.”

and its reply: "So really it has nothing to do with stealing whatsoever"

gen220
0 replies
1h21m

He's famous, so his view is the one that's most commonly regurgitated, but that doesn't make it the most correct one.

Steve's point of view is one point of view, in a story that involved ~10 people. When you hear the story from each person's point of view and union them, the subtly-incorrect aspects of his perspective become pretty glaring.

lupire
0 replies
18h46m

The price Apple charged for that stock was more than Apple's IPO price, AIUI. It wasn't a giveaway.

JKCalhoun
0 replies
4h44m

Thank you for that. I was going to post something similar.

What the Apple engineers did was take obvious inspiration from what they saw at PARC, but then they went in a different direction when they actually had to both implement it and make it workable as an OS. Overlapping windows are the most oft-cited innovation they came up with, but there were many others, perhaps more subtle.

The impression I have though is then that Gates basically copied Apple engineering, not PARC.

manav
5 replies
22h42m

Copying from Xerox, some irony there.

karmakaze
2 replies
22h25m

It's also a lesson that never seems to get learned. Xerox had the capital to set up a research arm, then failed to convert on any of its ideas because it was too focused on its current cash cow. They eventually repositioned as "the document company", where a document wasn't only paper, but it was too little, too late.

razakel
0 replies
8h55m

The same thing happened to Kodak - they were a tech company that thought they were a chemical company.

kragen
0 replies
4h56m

you're talking about xerox parc in the 70s

xerox parc in the 70s invented the laser printer. the laser printer has been almost all of xerox's business since last millennium. their current revenues are 7 billion dollars a year, almost entirely from laser printers (mostly in disguise.) so i'm not sure it was 'too little, too late' or even 'failing to convert'

(right now i think xerox is unprofitable, but that's an issue of profit margins and management, not an issue of not having revenue)

this article https://spectrum.ieee.org/xerox-parc has this pullquote:

From a purely economic standpoint, Xerox’s investment in PARC for its first decade was returned with interest by the profits from the laser printer.

and that was in 01985

you could posit an alternative history where xerox wasn't just making billions of dollars a year, for generations, out of laser printers, but also owned the entire market of laser printers, semiconductor foundries, guis with overlapping windows, ethernet, wysiwyg document editing, page description languages, and object-oriented programming, because all of those were indeed invented at parc

the article does in fact implicitly posit that alternative history. but it isn't clear that it was ever a possible history. centrally planned economies are not good at innovation; decentralized ones are. the most significant invention on that list, the semiconductor foundry, isn't even technical; it's a business structure that decentralizes chip design

very possibly they've made more money from their early stock in apple than they ever could have made by trying to exclude everyone else from the overlapping-window-gui market

freetinker
0 replies
21h46m

I miss this flavor of advertising. It all feels too anodyne these days.

wiz21c
4 replies
22h17m

I broke into his house

Not fun at all. Microsoft is like Disney, they steal from others and trounce others for stealing from them.

Absurd people.

zogrodea
2 replies
20h5m

I'm not doubting, but can you give a few examples of Microsoft trouncing others?

I do recall Disney (a main reason copyright laws last so long, and who didn't want Steamboat Willie to enter public domain).

I also think of Amazon (which the creator of the Elm programming language describes as having "the Jeff problem", because they steal smaller people's/teams' ideas), although that's a different problem.

I can't say anything comes to mind right now about MS, though, which is most likely a failure of my memory/knowledge. So I'd appreciate some examples.

lupire
1 reply
18h41m

Did you miss one of the largest antitrust cases in history, litigated for a decade?

jpc0
0 replies
3h4m

... they steal from others and trounce others for stealing from them.

Explain how the antitrust case or its conclusion proves the quote above?

Netscape/Mozilla had a browser that could have competed with Microsoft's, and Microsoft didn't litigate against them for stealing. The antitrust case was about it being effectively impossible to overthrow their monopoly because the platform was locked down.

The current crop of litigation against Apple reeks more of the Microsoft antitrust case than any Disney cases.

mandmandam
0 replies
20h50m

Sad to see this extremely historically accurate and relevant comment downvoted.

And on the forum which should most know it to be true!

opo
0 replies
16h42m

This quote, like many others, is funny because people pretend they can exactly remember things said decades earlier. Andy Hertzfeld later updated this entry on his website:

"....Here is Mike Boich's recollection of the 'Xerox' story, which goes a little differently, and is likely to be more faithful than mine: The meeting was one of the quarterly meetings, where Steve, Mike Murray, Belleville, you and I all got together with the Microsoft crew, which at was usually Bill, Jeff Harbers, Jon Shirley, and sometimes Neil and/or Charles Simonyi. I don't recall whether Windows had been announced, or we were just concerned about it, but Steve was trying to convince Bill that having a "Chinese wall between the Windows implementers and the Mac implementors wasn't sufficient for us to work well together. He was trying to get them to forget about the OS business, since the applications business would be much bigger total dollars. He said, "It's not that I don't trust you, but my team doesn't trust you. It's kind of like if your brother was beating up on my brother, people wouldn't say it was just your brother against my brother, they would say the Gates are fighting with the Jobs." Bill responded that "No Steve, I think it's more like we both had this rich neighbor named Xerox, and you went in to steal the TV, and found that somebody else had stolen it. So you say, "hey, that's not fair. I wanted to steal the TV"."

https://folklore.org/A_Rich_Neighbor_Named_Xerox.html

pembrook
17 replies
21h43m

Copying isn’t just how design works, it’s how everything works. Humans are imitation machines.

We create new things by collecting, regurgitating and mutating stuff we experience, just like LLMs. In a vacuum man has no ideas outside of base impulses.

Hence originality is a novice belief. The closer you get to any field, the more you realize the stories about who made all the breakthroughs are BS media narratives. Most if not all steps forward in any field involve hundreds of people clawing at similar ideas concurrently.

kivle
13 replies
18h3m

It's very Hackernews to throw LLMs in there, but I agree. LLMs don't have experience though. They have training data, and a probabilistic output.

Designing things has two goals:

- Make old things seem new

- Make new things seem old and familiar

Both need a lot of knowledge about how humans work and how we have made sense of the world up until now. Design can't be made in a vacuum and without input.

Edit: To expand: an LLM would never have come up with touch input. It would have regurgitated the existing ideas of using a pen or a mouse to point at things on a screen. Coming up with touch input was a huge feat that combined design (making touch obvious for any human, old or young) and engineering (making that interaction actually work).

visarga
10 replies
17h32m

LLMs don't have experience though.

They might not have had experience 2 years ago, but in the meantime they have assisted hundreds of millions of people with many billions of tasks. Many of those are experiences you can't find in a book: they contain on-topic feedback and even real-world outcomes of LLM ideas. Deployed LLMs create experiences, get exposed to things outside their training distribution, search the solution space, and discover things. Like AlphaZero, I think search and real-world interaction are the key ingredients. For AZ the world was a game board with an opponent, but one rich enough to allow discovering novel strategies.

kivle
9 replies
17h2m

This sounds like an ad. What is "assisted hundreds of millions of people with many billions of tasks"? Any real-world data? If it's generating new random clip art for presentations, sure. If it's making new flavor text based on generic input, sure.

If my question is "what is the circumference of the earth", and I run a model with a temperature of 100, will it give me a good result? Will it always give me a good result with a temperature of 0? I don't think so. It's a huge probabilistic model. It is not an oracle. It can be useful for fuzzy tasks for sure, but not for being smart. You might think it's clever because it's generated code for you, but that's probably because you asked it to make something 500 people already made and published on GitHub.

Edit: Just to clarify, I don't want to step on people's toes. I just feel like we're at the top of a new dotcom/crypto/NFT hype boom. Seen it soooooooo many times since the beginning of the 2000s. Don't go blind on technology; research what it actually is. An LLM is a "next word weighted dice toss machine".
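
(To make the "weighted dice toss" concrete, here is a minimal TypeScript sketch of temperature sampling over next-token logits; the function and its inputs are illustrative, not any particular model's API. Temperature near 0 collapses to picking the most likely token; very high temperature approaches a uniform roll across the vocabulary.)

    // Scale logits by 1/temperature, softmax them, then roll a weighted die.
    function sampleToken(logits: number[], temperature: number): number {
      const t = Math.max(temperature, 1e-6);  // guard against division by zero
      const scaled = logits.map((l) => l / t);
      const max = Math.max(...scaled);        // subtract max for numerical stability
      const weights = scaled.map((l) => Math.exp(l - max));
      const total = weights.reduce((a, b) => a + b, 0);
      let r = Math.random() * total;          // the weighted dice toss
      for (let i = 0; i < weights.length; i++) {
        r -= weights[i];
        if (r <= 0) return i;                 // token index i wins the roll
      }
      return weights.length - 1;              // floating-point fallback
    }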

Kiro
4 replies
14h6m

You might think it's clever because it's generated code for you, but that's probably because you asked it to make something 500 people already made and published on GitHub.

An LLM has no problem coming up with novel solutions within my own esoteric physics framework that only exists on my computer, using my patterns and taking all the nuances into account. It's definitely not just spitting out something it has seen before.

DJBunnies
3 replies
7h34m

Uhhh, physics?

Kiro
2 replies
7h28m

Yes, that can be used in a game engine for example.

jpc0
1 reply
2h58m

I think the point being made is that there is very little room for creativity there... There are tons of examples of physics engines written in multitudes of languages, with a full range of implementation quality.

Now if the LLM had known to look for and found dark matter or gravitational waves, all while sitting there on your computer comparing micro-changes between CPU cycles, maybe you would have a point. To my knowledge most physics engines don't even simulate full Newtonian physics, never mind more modern variants.

Kiro
0 replies
11m

I'm not talking about the physics calculations. I'm talking about it navigating, adapting and coding using my own patterns, coding style and structure within a context that is completely custom. It understands the framework I've built and what functions to use when, and writes code that looks and works as if it were my own.

kivle
2 replies
16h41m

And to expand on myself: "experience" means something very specific for humans. It means you have done something for a long time, you have failed at it, and you have learned from that. By that definition, LLMs don't have any experience at all. They are trained and become a fresh "brain", and then they make the same mistakes over and over and over, until they either get a new prompt that corrects them or are trained from scratch all over again.

visarga
1 reply
13h27m

What I meant is:

1. LLM generates an idea and

2. the user responds positively or negatively or

3. the user tries the idea and comes back to continue the iteration, communicating the outcomes.

For example, the LLM generates some code and I run it; if it fails, I copy-paste the error back.

That is the (state, action, reward) tuple which defines an experience.
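
(A minimal sketch of that tuple as a data shape, with hypothetical field names chosen just to pin the terms down:)

    // One deployed-LLM interaction viewed as an RL-style experience:
    // the context the model saw, what it generated, and the feedback
    // signal the user's follow-up provides.
    interface Experience {
      state: string;  // prompt plus conversation context
      action: string; // the model's generated answer or code
      reward: number; // e.g. +1 if the user confirms it worked, -1 if an error comes back
    }

    const example: Experience = {
      state: "user: write a function that parses this log format",
      action: "(generated code)",
      reward: -1, // the user pasted a stack trace back into the chat
    };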

jpc0
0 replies
2h54m

Sounds like the LLM helped a human gain experience by making mistakes for the human and then correcting those mistakes, also likely incorrectly. LLMs are effectively very, very bad teachers.

The LLM, given the same inputs tomorrow, is likely to return similar responses. If a human did that, they would likely be considered to have some sort of medical condition...

visarga
0 replies
13h30m

What is "assisted 100s of millions of people in many billions of tasks"?

Let's assume 180M users for ChatGPT, each user using the model just 5 times in a month. You got the 1B tasks. If one task uses up 1000 tokens - you have the trillion interactive tokens. It's all based on public data and guesstimation.

rbits
1 reply
16h43m

I don't know if I buy that an AI wouldn't have been able to come up with the touchscreen. It knows people touch screens with pens, and it knows people point at and touch things that aren't screens. It could put those ideas together; that's how people came up with it.

kivle
0 replies
16h28m

It's not an AI. It's a word probability model, trained on tons of text. It's not smart at all. The only reason it might have "figured that out" is that the idea was actually present in sci-fi texts in the 60s. The other reason would be that you increased the temperature to something like 100 and now think you see something genius in the hallucinations, among otherwise unreadable text.

tracerbulletx
0 replies
19h53m

This is why not much changed for tens of thousands of years until writing was invented, and why change accelerated when a valid method of iteration (science) was instituted.

Lammy
0 replies
1h30m

This is also why Peter Thiel et al are so obsessed with René Girard.

Philosopher René Girard, scholar Robert Hamerton-Kelly, and Thiel co-founded IMITATIO in 2007 to support the “development and discussion of René Girard’s ‘mimetic theory’ of human behavior and culture.” Mimetic theory, the concept that humans are fundamentally imitative, has had a profound effect on Thiel, who calls Girard “the one writer who has influenced me the most.”

https://www.theverge.com/2016/12/21/14025760/peter-thiel-het...

Affric
0 replies
14h51m

It goes deeper than that. We are copies of our parents made out of self replicating molecules… copying is fundamental to pretty much everything interesting that has ever happened.

ilrwbwrkhv
15 replies
23h6m

Beautifully written article. One of my first ideological shifts happened when Napster was released. Bits flowing freely, unbounded by the rules of the physical world, deeply changed me, and while I later came to understand that artists need to get paid and make a living, piracy and the pirating community are still very close to my heart. The amount of innovation that comes out of that space is tremendous. The fact that Zuckerberg could create trillions of dollars of value on top of free projects such as PHP and Apache is not cherished enough.

I think we still haven't found a proper economy for the digital world. The fact that pirating Game of Thrones was a better option than waiting for it to premiere in your region goes to show there is still a lot of work to be done in this area. Without piracy, free software, open source, and American VC (the first few waves, not the last few), this industry wouldn't have grown at this pace.

ppqqrr
14 replies
22h40m

The “artists need to make a living” narrative against piracy is pure deception. The truth is that most artists want nothing more than for their message to spread as widely as possible, which is also the most naturally profitable path for them in the long term. It's only when managerial types get involved that turning a quick buck by denying the natural flow of information becomes a primary concern. So pirate away, knowing that nothing of value is lost.

inglor_cz
8 replies
22h35m

I self-publish my books. The audience is decent, I publish shortened audiobook versions for free, but frankly, I like the fact that the paper books themselves are copyrighted and no one can print them extremely cheaply and flood the market with them at my expense.

It would have been natural, but also depressing.

surfingdino
4 replies
22h29m

I stopped self-publishing my books because, as soon as I offered PDFs to those who purchased my paper books, the sales of printed copies tanked. Then nobody wanted to pay for PDFs, and Amazon screwed my KDP sales (banned my book). The readers felt entitled to free copies and free consultation on the subject of the book. It's really depressing how entitled people feel to other people's creative output or knowledge.

__mharrison__
2 replies
13h2m

Did you just stop publishing altogether?

surfingdino
1 replies
10h44m

Yes. I do not need another book as a CV, which is currently the most viable business model for authors of non-fiction.

meiraleal
0 replies
8h9m

We are probably better off without your marketing content tho

inglor_cz
0 replies
22h16m

That is why I publish freely the audio versions (which only consist of about half of the stories within each book), but not the PDFs.

shagie
1 replies
21h36m

I like the fact that the paper books themselves are copyrighted and no one can print them extremely cheaply and flood the market with them at my expense.

Amazon has a book piracy problem ( 219 points by tosh on July 8, 2022 | 120 comments ) https://news.ycombinator.com/item?id=32026663 https://x.com/fchollet/status/1550930876183166976 (and via Threadreader - https://threadreaderapp.com/thread/1550930876183166976.html ) - also https://news.ycombinator.com/item?id=32210256 ( 665 points by jmillikin on July 24, 2022 | 193 comments )

Pirated books thrive on Amazon — and authors say web giant ignores fraud - https://news.ycombinator.com/item?id=35761641 ( 87 points by vanilla-almond on April 30, 2023 | 79 comments ) https://nypost.com/2022/07/31/pirated-books-thrive-on-amazon...

Amazon caught selling counterfeits of publisher’s computer books—again - https://arstechnica.com/information-technology/2019/02/amazo...

Having something on paper doesn't mean that no one else can print it cheaply and flood the market. While it might not be at your expense, it certainly isn't making you any money.

inglor_cz
0 replies
21h21m

I don't self-publish on Amazon, though. I print my books in a local printing shop and sell them through my e-shop (WordPress for the blog, WooCommerce for the e-shop).

__mharrison__
0 replies
13h3m

I also self publish books.

Most of the audience is decent. But there are some bad actors out there.

And a lot of the time, the biggest book marketplace appears to (intentionally) close its eyes to this problem.

Piracy of my books from the dark web is one thing. Amazon pushing it is another.

Terr_
3 replies
22h30m

The “artists need to make a living” narrative against piracy is pure deception. Truth is that most artists want [...]

I'd like to offer a more moderate option--or perhaps just radical in a different direction.

Artists would like to make a living, and the "deception" comes from how that slogan is used to falsely present the powers-that-be as able, willing, and actively delivering on that goal.

ppqqrr
2 replies
22h11m

Thanks for the clarification - I do not claim that artists don’t want to make a living. My point is that, too often, the “artists need to make a living too” narrative is used by the system that exploits artists.

hluska
1 replies
21h8m

Can you rephrase that without a double negative? I don’t have a clue what you’re trying to say and your explanation makes it worse.

Terr_
0 replies
20h31m

Not parent poster, but I suspect the thesis can be rephrased like:

"Artists do want to make a living, however there's a nuance when it comes to achieving that. Finding enough solid supporters requires such a wide dispersal of their content that any 'anti-piracy' measures are almost always counterproductive, at least when it comes to the interests of artists as opposed to middlemen."

kmeisthax
0 replies
20h49m

It's important to note who is pushing the deception, here. Creative industry is composed of both labor (artists) and capital (publishers). I file artists under labor because their valuable economic resource is time. They make money when people pay them to make art. Unauthorized copying has harms, but the primary effect is that artists have to expect to be paid money up-front, since the only way they get profit participation on the sale of copies is if there's a strictly enforced set of laws to grant a monopoly on copying. That being said, money up-front is still a very common way for artists to get paid, so "artists need to make a living" is a half-truth.

Paying per-copy and agreeing not to copy for some fixed period is more consumer friendly than, say, everyone pooling their money into a giant one-and-done Kickstarter and just trusting that the end result will be good. If your work can be published serially, then something like Patreon might work, but that's impractical for a lot of larger projects. The consumer unfriendliness manifests in the form of risk: who is out the money if something turns out to suck, or worse, doesn't even get made. The traditional "sell copies with a monopoly" model means that if I don't like a work, I just don't buy it. We have reviews to inform people if a thing is good or not, but you can't review a finished work based off the Kickstarter campaign. This results in a market dominated by scams of varying degrees, customers who are hesitant to put money into campaigns that might not produce, and artists that can only really make the business model work if they have a lot of social capital and reputation to stake.

I mentioned fancy capitalist words like "risk" and "market", so let's talk about the capitalist side of the business: the publishers. Or "managerial types", as it were. They do not make their money from selling the service of creating art, they make money from selling art that has already been made, which is capital. When Napster was telling people to stop paying for music and just steal it, the publishers shat their pants. An embarrassingly large part of the music business at the time was reissuing old acts on CD[0][1], and even new acts had to sell albums, which is why 90s listeners had to deal with a flood of albums with one good song and 10 terrible ones.

It's specifically the capitalist side of the business that got screwed over the hardest by Napster. What screwed over artists was Spotify, which made music profitable again for the capitalists by turning it into a subscription. A music Boomer[2] accurately summed this up as a faucet pouring water straight into a drain. This is the best way to devalue artists, because it doesn't matter what songs the artists make - just that the publishers control the flow of the songs.

The Spotify mentality has percolated into basically every other form of media over the last decade. It's why you will own nothing and be 'happy', and why every publisher CEO has a boner for generative AI, even as their artists are screaming their heads off about being scraped. Publishers have nominally been stolen from as well, but they don't care, because the theft is in their benefit[3]. It's the exact opposite of the Napster situation. What matters is not what will benefit the artists, nor what the law says. What matters is what will make them richer.

[0] This is also why the SPARS code was a thing for a few years - to distinguish between new recordings made for CD and reissues riding the hype of digital music.

[1] Metallica also found themselves caught on the back foot, mainly because they found out Napster users were trading pre-release soundtracks they'd made. Their reaction made them look like suits for a while, because Metallica had gotten popular through unlicensed copying, though I don't think this read was entirely fair.

[2] https://youtu.be/1bZ0OSEViyo?t=485

[3] I don't think generative AI will replace real artists, but it doesn't matter so long as publishers believe it can.

breck
8 replies
22h59m

Go further.

If you model ideas mathematically, you will see that societies plagued with IPDD (https://breckyunits.com/ipdd.html) will become extinct, because they prolong the lifespan of bad ideas, and those with intellectual freedom, where bad ideas rapidly evolve into good ideas, will rise to the top of the food chain. The equation is simple: ETA! (https://breckyunits.com/eta.html)

Question whether we should even have a concept of "licenses" (hint: we shouldn't). Look up "freedom licenses", which "freed" African Americans had to carry around in the 1800s. Think about how future generations will look at us for having a concept of "licenses on ideas". Then think about the natural progression from automatic licenses on ideas (the Copyright Act of 1976) to licenses on breathing: there is no reason not to require "licenses" to breathe, given that you exhale carbon dioxide molecules just as you exhale "copyrighted" information.

llamaimperative
3 replies
22h9m

Why on earth would IP keep bad ideas around? You're free to make a better idea and let it compete in the market, since by being better it'd definitionally be different.

iczero
1 replies
20h28m

Let's say someone patents, idk, Client-Side Decoration (CSD). People like it, surely, because people use it. Unfortunately, there is drastically reduced space to innovate because nobody else can use that idea anymore. Expecting the patent holder to innovate has proven to be a bad assumption, in part because IP rights mean they have no competition in that space anyway. The idea stays bad because nobody else can make it better.

llamaimperative
0 replies
18h24m

So people like it but it's bad? How exactly are you defining bad?

tempfile
0 replies
1h30m

For the same reason any accumulation of capital allows bad ideas to hang around. You can operate at a temporary loss to weed out new competitors, you can intimidate newcomers with frivolous legal action, you can leverage network effects, you can lobby for regulation that makes it hard for competitors to start up...

surfingdino
1 replies
22h42m

I think you are conflating copyright with patents. Licenses and other forms of intellectual property protection exist so that those who control means of production and distribution pay those who have ideas, or produce creative output.

bediger4000
0 replies
20h41m

I agree they're different, and different still from trademarks, but the common thing is to conflate it all under "Intellectual Property", isn't it?

I'm deeply suspicious of this conflation. I think it's done on purpose, in bad faith, for nefarious reasons.

rileymat2
1 replies
22h21m

Is there any evidence that the equations in the blog post model the real world?

I ask, because these intellectual property protections are intended to incentivize creation. If that incentive overwhelms these models of information sharing and testing frictions then the model is incomplete.

breck
0 replies
21h31m

because these intellectual property protections are intended to incentivize creation

Judge something not by what people say it does, but by what it actually does.

If that incentive overwhelms these models of information sharing and testing frictions then the model is incomplete.

Agreed. But try as I might, I can't find any way, theoretical or empirical, to model copyrights and patents that shows a positive impact on innovation.

Nature's survival of the fittest already provides near infinite incentive to innovate.

Now, I think patents and copyrights had a positive side effect in the early days of the United States, because they created a centralized library in the District of Columbia containing all of the latest information across the fledgling nation. But with the Internet, we don't even need that anymore. All the other parts of those laws are harmful and a drain on innovation.

Look at what happened with Windows/CrowdStrike - ultimately another harm caused by closed-source, under-evolved "IP protected" ideas. Ironically, Microsoft calls Windows its "intellectual property" when collecting money, but when that IP harms people, suddenly it's not their property.

Is there any evidence that the equations in the blog post model the real world?

Depends on where you live. If you live in America, evidence is all around you. :)

But here is some hard data, thousands of programming languages ranked by languages most used to build other languages (which gives an objective measure of idea quality):

https://pldb.io/lists/explorer.html#columns=rank~name~id~app...

Utterly dominated by open-source langs. Closed-source, IP-protected ones are headed for extinction.

rogerclark
7 replies
20h39m

Carmack is a great programmer to be sure. Commander Keen, however, was not a better version of Mario. It was worse than Mario in every way -- art, music, and gameplay are all inferior.

Nobody outside of Gen X PC gamers knows what Commander Keen is. Everyone knows what Mario is. While copying may be the way design works, copying only gets you so far.

bongodongobob
2 replies
20h34m

The article didn't say it was better. No one thought it was better. It was just the first time anyone was able to smoothly side-scroll on a PC. By copying something, he was able to push the boundaries of the perceived constraints of the technology, which I believe is what the article is pointing out.

rogerclark
1 replies
20h32m

"Disappointed, but not defeated, they resolved to build a better version of Mario."

bongodongobob
0 replies
20h28m

Resolved = tried, wished

bigstrat2003
2 replies
19h26m

Disagree. Commander Keen 4-6 are better than Mario, imo.

trentnix
0 replies
4h6m

Even if that were true (and I don't think it is), Commander Keen should be compared to Mario 3 (which still came out over a year earlier than the first Keen), not Mario. And 4-6 are most appropriately compared to Super Mario World, which was released the same year.

Keen was/is great, but Mario 3 and Mario World are on the shortlist for best game ever.

thestepafter
0 replies
14h43m

To add to this, Commander Keen was released on a very limited platform. More people were gaming on Nintendo systems than on personal computers. If Commander Keen had been released on Nintendo, things might have gone differently.

robertlagrant
0 replies
19h58m

Millennial here - I played this as a teenager on our ancient-for-the-time family PC.

ljlolel
5 replies
22h52m

I can make cheap, small-scale facsimiles, fangzhipin, to demonstrate some quality of the original. I can make exact replicas, pixel-perfect fuzhipin, to learn how the originals and their creators work. Or I can create shanzhai, unsolicited redesigns, commenting and riffing on the work of others. All these copies have an important role to play in the process of design.

Whether you believe that it’s worthwhile or worthless to copy, whether you think that copies are a valuable part of the design community or a scourge, you are using software, hardware, websites and apps that all owe their existence to copying.

As long as there is design, there will be copying.

fsckboy
4 replies
22h20m

As long as there are new ideas, those without such ideas will copy them

JoshTriplett
1 replies
21h14m

As long as there are ideas, there will be people who claim their "new ideas" have absolutely nothing in them derived from any previous ideas. Such people then scorn others who do not help them maintain the same fiction, and who instead dare to acknowledge that everything builds on what came before.

unraveller
0 replies
29m

How dare you imply the origins of my inspiration are not mysterious. Now what am I going to tell the interviewer when they ask "where did you get the inspiration for that?" They always like my non-answers.

burnished
0 replies
21h56m

I see that you've copied every word you used here, not very original of you

CognitiveLens
0 replies
22h9m

But that take is too narrow - many of the 'great' painters had extensive training in the work of previous masters, frequently copying their works repeatedly in order to develop technique and more deeply engage with what came before. After developing that base skill and understanding, they had a better toolset to express their own originality.

twobitshifter
4 replies
19h55m

There’ve been recent discussions on TV news about ‘dupe’ sites for fashion and home goods. The big fear is that the popularity of dupes will harm the original designers. However, fashion copyright is a distinctly modern concept. In woodworking, if you saw a chair you liked, you might pay for a plan, but then make it yourself as many times as you wanted. A cobbler could look at a shoe and know how to make it for their customer. A tailor could change a collar or stitch to match what anyone wanted. There was no demand that every worker have a unique design - everyone understood it was made to order. When it becomes possible to scale a design to worldwide sales, the claims of uniqueness seem to become more important - but should they?

chrstphrknwtn
2 replies
19h34m

Tom Ford commented on the issue of counterfeit and "knock off" products in the fashion industry. He said that after some research (I assume by him/his company) they found the people buying the cheap counterfeits weren't their customers anyway, so they weren't losing anything.

lupire
1 replies
18h42m

That's not the problem in fashion. The problem is if their non-customers tarnish the brand and drive customers away.

chrstphrknwtn
0 replies
15h45m

What industry do you think Tom Ford was talking about?

vizzier
0 replies
19h39m

Counterpoint to that though: guilds existed as a different form of control for many hundreds or thousands of years. Instead of controlling what people could make, they controlled who could make it.

m3kw9
3 replies
21h2m

You sort of want to use affordances. When Apple creates a new type of UI, it's usually because they introduced new tech. Take the recent Samsung copycat AirPods: Samsung cannot invent a new UI because they are not the innovators, so they need to borrow affordances from Apple.

As for why they copy the shape and size - that is the part where you could be more artistic, and it seems they have no taste.

(Affordance here meaning using what people are already familiar with so they don't have to relearn an interface.)

esalman
2 replies
21h0m

A lot of the UI features and associated tech that Apple introduced in the iPhone and iPad over the last few years lagged Android by a few iterations. What gives?

m3kw9
0 replies
5h6m

Yeah, like the customizable home screen and widgets. They ain't perfect, but I think they really wanted them done right for mobile. The widget itself isn't new in Apple's ecosystem, as Macs have always had widgets.

Suppafly
0 replies
1h46m

That's sorta the history of Apple in a nutshell. For every actual innovation, there is a ton of commoditizing things that exist and repackaging them for locked-in Apple consumers that were previously unaware of them.

jjcm
3 replies
20h17m

One of the mistakes I made as a young designer was pushing back against trends and fads. My opinion at the time was that trends that weren't thought out from a position of UX principles were an anti-pattern to follow. As I matured more as a designer, I now think nearly the opposite - not following trends is an anti-pattern, since that's what your users will be used to.

Pull down to refresh is a great example of this. Not visible or discoverable at all, but was all the hype when Tweetie first released it. On paper it's an anti-pattern, but now it's so ingrained as a trend and pattern that it became expected, and is now muscle memory for many users.

The same goes with flat buttons - I used to be quite opposed to them since there was no visual elevation off the page designating it as a button. Now if you create a button with a bevel, users will think it's an ad, not part of the page itself.

Copying leads to harmony in the wider ecosystem, and it creates a shared agreement on what things are and how they work. It's an important part of the user experience.

lupire
2 replies
18h39m

That's just your own bad taste.

Pull to refresh is useful and optional.

Flat buttons save precious space on tiny mobile devices.

hardwaresofton
0 replies
17h54m

I can only hope this was a joke/light hearted but anyway

That's just your own bad taste.

Please be civil

[link to hn guidelines here]

ranger_danger
0 replies
12h5m

It's been submitted to HN many times but has never spawned any discussion:

It could be because, at least for me personally, I found the first 15 minutes to be a little too boring. Perhaps people just gave up before then.

amadeuspagel
0 replies
21h37m

It's easier to discuss a text, to quote from it, to comment on it -- to remix it, you might say.

indiv0
2 replies
22h12m

Reminds me of one of my favourite video essays -- "Everything is a Remix" [0]. The video and this article cover the same ideas albeit with different examples. Which is funny on a meta level -- the article could be called a remix of the video.

The video (if I recall correctly) goes a bit further, attacking patents/IP law as anti-creative.

[0]: https://www.youtube.com/watch?v=nJPERZDfyWc

scoot
0 replies
18h41m

There's a fine line, though. Led Zeppelin didn't remix; they flat out ripped off other artists' lyrics and melodies. Changing the musical genre doesn't wash, IMHO.

amelius
0 replies
19h42m

Yes, and Disney copied old fairy tales and made them their own.

doctorpangloss
2 replies
23h6m

"Functionally and aesthetically, the chairs are identical."

Listen dude, go ahead and buy the $145 Modway chair. It's so bad, it is $118 nowadays. It will literally fall apart under your ass. Read the reviews.

MatthiasPortzel
1 replies
18h53m

the chairs are identical

Followed by pictures of two different-looking chairs. IMO the Modway looks notably worse.

Suppafly
0 replies
2h15m

They really aren't different at all, though; they only look different because of the finish on the plywood.

xtiansimon
1 replies
6h2m

A manifesto, how retro.

Suppafly
0 replies
1h51m

A manifesto, how retro.

Somewhat unrelated, but it's a shame that manifestos have such a bad rap, most often associated with terrorists and such. There is something sorta nice about sitting down and clearly declaring your thoughts on a subject. It makes sense that people pushed to the edge want to let us know why they are behaving the way they are, but it's a shame that normal people aren't encouraged to reflect upon their thoughts and write them down. Being able to think about a topic and put on paper that these are my thoughts and feelings about $x brings a certain amount of clarity to your thinking, and it can help other people understand your thinking in a way that has a lot of power. Consider historical documents like the Declaration of Independence: the points are laid out in a way that, even if you disagree with them, there is no denying what they are declaring.

raptorpark
1 replies
5h6m

Self-taught artist here; I learned by copying my favorite comic books and painters.

Went to art school, and a significant part of my art history class dealt with remembering the names of art "movements", which is a veiled way of saying periods when everyone was copying each other. Then of course you learn about the influential artists who heavily borrowed from xyz. Another funny one is "revival", which just means "straight up copy".

This is why I have limited sympathy for the uproar about AI art. It's just cutting through the boring part.

JKCalhoun
0 replies
4h41m

I think it is how every artist learns. An artist's "style" later comes to be the bits and pieces they preferred from the various artists they copied while learning — a kind of Stone Soup style.

myworkinisgood
1 replies
9h30m

Side note: This is why I feel Stallman is more of a visionary and will have a much more lasting impact than Jobs did (the character flaws of both men notwithstanding). Jobs stole and kept market dominance to keep the loot for himself. In medieval times, Jobs would have been a raider. Stallman empowered people and let them have the fruits of their own labor.

andruby
0 replies
13m

The analogy doesn't work well. Sure Steve Jobs and Apple shareholders got rich, but everyone that bought a Mac, or iPhone, or iPad was able to benefit from the innovations they brought.

As far as I know, a raider doesn't share or enable others.

eduction
1 replies
20h55m

Steve Jobs didn’t just waltz into Xerox PARC and steal a glimpse at the Alto. That visit was heavily lawyered and PARC got Apple shares as compensation. To summarize this as “stealing” is just incorrect. Lazy work.

ranger_danger
0 replies
11h45m

Except Apple recruited GUI engineers from PARC to work on the Lisa and Macintosh itself. I don't think you can steal any better than that.

eddyzh
1 replies
21h0m

Insightful perspective.

It might be worth pointing out what year it's from. It looks like 2020.

atoav
1 replies
11h46m

As a former designer with a broad background (graphics, typography, print, web, product design) whenever I read something like this I get the feeling people generally have a misguided idea of what design is.

Many (bad) designers confuse what I would call styling with design. Design is largely about functionality and how information is organized visually. These two core aspects of design can only be copied if the underlying project is exactly the same in terms of underlying information. But even for two blogs about different topics, the question of which information needs to be presented, and how, would be different — even if both blogs were using the browser's default CSS. This is the core of design.

Styling is finding colors, shapes, proportions, etc. All of this of course overlaps with the functional question and the question of the organization of information — bigger buttons get more attention and all that — but ultimately you can slap more or less any style on any content. Whether it makes sense is a different question.

andruby
0 replies
17m

I like your description, and it resonates with how I see things.

A lot of people think of aesthetics when they hear "design". But design is about how things work. Everything we use was at some point designed by someone.

In our SaaS company we changed the role of Designers to Product Designers to help people understand it a little better.

zogrodea
0 replies
20h17m

Great article. Reminds me of this quote from RG Collingwood about how pervasive copying has been throughout history, and how the famous names we now know to have copied would be baffled by our shock.

"Individualism would have it that the work of a genuine artist is altogether ‘original’, that is to say, purely his own work and not in any way that of other artists. The emotions expressed must be simply and solely his own, and so must his way of expressing them.

It is a shock to persons labouring under this prejudice when they find that Shakespeare’s plays, and notably Hamlet, that happy hunting-ground of self-expressionists, are merely adaptations of plays by other writers, scraps of Holinshed, Lives by Plutarch, or excerpts from the Gesta Romanorum; that Handel copied out into his own works whole movements by Arne; that the Scherzo of Beethoven’s C minor Symphony begins by reproducing the Finale of Mozart’s G minor, differently barred; or that Turner was in the habit of lifting his composition from the works of Claude Lorrain. Shakespeare or Handel or Beethoven or Turner would have thought it odd that anybody should be shocked."

I do understand the desire to protect one's work too and find it hard to take a single side.

xiaoape
0 replies
15h34m

Reminded me of a short article: https://signalvnoise.com/archives/000324

We’re not designers, or programmers, or information architects, or copywriters, or customer experience consultants, or whatever else people want to call themselves these days… Bottom line: We’re risk managers.
trentnix
0 replies
18h57m

Copying from one source is plagiarism. Copying from multiple sources is research.

thomastjeffery
0 replies
1h51m

The irony of copyright is that it demands copied design.

Want an online menu for your restaurant? Well, you can't just go copying someone else's design; so you must create your own from scratch. Will yours look and behave practically identically to the other? Yes. Will both websites be overall worse quality than if everyone just collaborated on a standard design? Yes. Would it save the world an incredible amount of redundant work to just allow people to copy each others' work? Yes. Who wins in this arrangement? Only those who have already won.

Keep looking at this pattern, and you will enter a deep cavernous rabbit-hole. At the bottom, you will find yourself at the very core of design itself: the goals, philosophies, and systemic failures of every design we use today can be traced back to this point: collaboration must be avoided at all costs. Compatibility is the cardinal sin, and it must be punished.

So we go on, building silos upon silos. When will we ever learn?

---

There is a lot of talk lately about change. They say, "AI will be the end of copyright. It's too important to hold back the potential of AI over a petty argument for intellectual property." I don't believe for a minute that LLMs will ever reach the lofty goal of "General Intelligence". I don't believe for a minute that megacorps like OpenAI, Google, and Meta deserve a free pass to siphon data for profit. So why is it that these words ring true? AI has nothing to do with it: it's design itself that has incredible potential, and we should absolutely stop holding it back. Intellectual Property is nothing more than a demand against progress.

sva_
0 replies
7h47m

(2020)

surfingdino
0 replies
22h32m

The author got lost in his argumentation. He starts with design, but goes off into the lands of open source, patents, and art. It's not a well-written or researched article. Design is not software development is not art.

seanwilson
0 replies
17h18m

In a 2005 forum post, John Carmack explained his thoughts on patents. While patents are framed as protecting inventors, he wrote, that’s seldom how they’re used. Smart programmers working on hard problems tend to come up with the same solutions.

I find this happens in UI/UX design too. When you're trying to come up with the best interface for a problem, there are only so many directions that make sense once you've explored the design space and understood all the constraints.

With desktop and mobile interfaces for example, all operating systems and devices have converged on a lot of similar patterns and visuals. I don't think this is because people are unoriginal, but given the constraints, there's only so many decent options to pick from so many designers will inevitably converge on the same solution.

I’m a designer. As a designer, I feel the need to be original.

I'll often come up with a solution on my own after immersing myself in a problem for a while, then after looking at existing work more later, find it's already been done. I'll then sometimes even consider changing my solution so it doesn't look like I copied, but usually there's no obvious other direction you can go in that is close to as good.

rramadass
0 replies
13h58m

Even more strongly: Copying is the way we Learn.

okonomiyaki3000
0 replies
5h6m

I've always said: Good artists borrow, great artists steal. You can quote me on that.

nxobject
0 replies
17h45m

“Start copying what you love. Copy copy copy copy. At the end of the copy you will find yourself.” – Yohji Yamamoto.

lemax
0 replies
11h57m

As a designer, one eventually thinks not about what they liked in other people's work but about why it worked. You can derive a design out of a compendium of things you've seen and liked, but ultimately, to be successful you need to know why what you're copying made sense for its purpose. Perhaps you even need to encounter the same problem; it takes a bit of maturity to copy effectively.

jmdots
0 replies
4h53m

This is all great, but for small-time programmers trying to get a company off the ground, acting defensively is justifiable. It's not always a question of a guy like Carmack having a good time cloning the big corp thing. Sometimes, and lately many times, it's a big corp obliterating an inventive small business by releasing its own copy, or simply using its monopolistic power to drive the business into an inequitable sale.

If you're small-time and have a great idea, you're better off going stealth; that is its own mitigation against destructive copying.

fasteddie31003
0 replies
13m

This is relevant for me today since we are designing a new house. Going with an architect looks like $50k-$100k for basic building schematics, not even the build plans. That seems like a lot to me. The route I'm going down now is finding houses I like on Zillow and hiring a designer on Fiverr to basically copy them and create a 3D model in Revit that can eventually become building plans. So far the Fiverr designer costs $100 per Zillow house to model into pretty good Revit plans that I could take to a draftsman in my area to turn into building plans. It feels a little like cheating, but I've been seeing good results so far.

asdasdsddd
0 replies
21h26m

Re: The copied terracottas

Originality is overrated in art; painting restoration usually entails repainting large sections of the original. The image and the ideas far transcend the "original", which is usually reserved as bragging rights for uber-rich collectors. The best art is the art you get to enjoy every day.

analog31
0 replies
19h47m

"Lesser artists borrow, great artists steal." -- Igor Stravinsky

(Probably stolen)

adolph
0 replies
5h0m

But at $145 (the equivalent of $12.78 in 1947) it’s more affordable than the LCW was when it was first manufactured and sold.

The article isn’t explicitly dated (afaict). Using an inflation calculator leads me to believe it was written in 2019 [0]. The same calculator indicates a material deviation from the quoted number: “$145 in 2024 equals $10.16 in 1947.”

Amazingly, the chair is listed on Amazon now at $118.53 [1] (at least for my login/cookies/tracking; price includes shipping estimated at 6 days), the equivalent of $8.31 in 1947, a 60% off sale.

The cost probably has some externality tradeoffs however. Was the wood clear cut by children from thousand year old forests? Was the chair manufactured by prisoners using chemicals known by the state of California to cause cancer?
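The conversion those calculators do is just a CPI ratio. A minimal sketch in Python, with approximate annual-average CPI values hardcoded as assumptions (not taken from the linked calculator, which explains the small deviation from the figures above):

    # Deflate a modern price into 1947 dollars via a CPI ratio.
    CPI_1947 = 22.3   # assumed annual-average CPI for 1947
    CPI_2024 = 313.7  # assumed annual-average CPI for 2024

    def to_1947_dollars(price_today):
        return price_today * CPI_1947 / CPI_2024

    print(round(to_1947_dollars(145.00), 2))  # ~10.31, close to the 10.16 above
    print(round(to_1947_dollars(118.53), 2))  # ~8.43, close to the 8.31 above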

0. https://www.saving.org/inflation/inflation.php?amount=145&ye...

1. https://www.amazon.com/Modway-EEI-510-WEN-Fathom-Mid-Century...

__mharrison__
0 replies
13h9m

Data folks would do well to find some good visualizations (from the Economist or New York Times) and recreate them.

They will learn a lot from doing so.

Osiris
0 replies
16h12m

There is no such thing as completely original work. No matter what, all of your ideas are influenced by your life experience and what you've seen.

Gualdrapo
0 replies
17h33m

Back at uni, a teacher used to say that everything (in design, at least) has already been made - so yes, "creativity" was the act of putting together things that already exist and that nobody had thought to combine before.

1GZ0
0 replies
11h50m

Good artists copy, great artists steal

- Pablo Picasso