
You are not late (2014)

tomcam
55 replies
5d13h

There are always opportunities, at least in the USA.

I started programming in 1985 and thought I was very likely too late. I was the same age as Chris Espinosa, who was Apple's head of documentation at age 14. At that time it was still not perfectly clear that PCs would be regarded as essential to business. I had no one to talk to about computers and programming. I was in a very rough spot in my life and had to teach myself everything, while not being sure I'd have a job in 5 years.

A decade later in 1996 I started at Microsoft, which virtually all of my peers thought had run its course. They were selling stock like crazy. By the time I left 4 years later the stock had gone up 1300%. Last I checked it's gone up yet another 2500% or so--I kept my 1,000 ESPP shares mostly out of nostalgia and gratitude to a company that was so good to me.

I bought a website after the first dot com bust of 2001, monetized it by hiring well, and it provided a very good living for well over two decades after that.

This is an incredible time to start a web-based business because there's a surfeit of free and almost free tools for development all the way to deployment.

up2isomorphism
28 replies
5d12h

Giving an example from 1985 to show there are opportunities today is not convincing.

tomcam
10 replies
5d10h

I like people with your attitude! Leaves more for people willing to work and to learn things. I upvoted your comment out of pity.

That era represented the collapse of 8-bit microcomputers. They were huge, then nothing. Silicon Valley had boomed, then gone bust. No one had much use for computers except brave financial analysts willing to use the toylike Apple ][ for primitive spreadsheets. No one took the Mac seriously until PostScript and laser printers happened.

In October 1987 Black Monday hit, causing the worst market crash since the Depression. I hit up a bunch of angel investors—a term that didn’t even exist—to raise money for a batch file compiler. A compiler! That supported me and several employees for years.

Point is that you can always make opportunities happen even in hard times.

ryanjshaw
3 replies
5d9h

> Point is that you can always make opportunities happen even in hard times.

Always?

I work harder than anybody I know personally. I am constantly learning new things. People ask me all the time how I "know this stuff", and that has been true since I was a kid in a desert town in the middle of nowhere that only had a library to learn from.

I have tried many, many ideas and failed. I'm halfway through life now. I've missed many opportunities in hindsight, likely due to my narrow fixation on details. This is an aspect of HFA (high-functioning autism), which is something I've only recently learned about myself.

I think what you've said is factually correct. But not everyone is equipped to exploit the opportunities that may exist, and may struggle to be aware of their limitations. Furthermore, life doesn't guarantee you anything - luck is always an element.

> Leaves more for people willing to work and to learn things.

> I upvoted your comment out of pity.

Even the hardest-working people can run out of luck and become frustrated. That doesn't change the fact that their best strategy is probably not to give up, but maybe be a bit more empathetic?

tomcam
2 replies
5d9h

Thanks for a thought-provoking response. I was careful not to say one can always succeed in hard times. And I certainly didn't talk about guarantees. But yes, one can make opportunities happen in the States. Sometimes, or even most of the time, we fail. I spent $1.4 million of my retirement money trying to beat Craigslist. The wife sure wasn't thrilled. Nor when I tried to take on Gmail.

I was sure lucky, but I put myself in luck’s way as often as possible. That’s after I got away from home in my teens after years of neglect, beatings, depression, suicidal ideation, and sexual abuse.

Be more empathetic to someone being snide after I made a sincere post trying to offer up some hope in hard times? Fuck that noise.

ryanjshaw
1 replies
5d9h

Fair enough! Sharing your story will no doubt help somebody experiencing self doubt, and that could be exactly what somebody out there needs right now.

tomcam
0 replies
5d8h

Dude, I wrote the book on self-doubt! The only thing that ever kept me going before a certain level of success was my observation that although I’m never the smartest guy around, I always know people dumber than I am who succeeded in similar areas by just showing up and grinding. I still overthink and sabotage myself.

waihtis
1 replies
5d10h

I'm not nearly as successful as you but agree wholeheartedly - there are 8 billion humans on the planet with constantly refreshing interests, and it's absolutely preposterous to think this doesn't provide a recurring, increasing source of opportunity.

tomcam
0 replies
5d1h

Meh, be careful with the word successful. You have no idea how successful I am. I’m way better off than most people, way poorer than many folks on HN, but I could be a complete jerk, or someone who pissed it all away, or someone who abandoned his family. Those things and so many others would disqualify me as successful in my book.

To me if you’re living in alignment with your values and aren’t too much of a drain on society most of the time, you’re successful.

mr_mitm
1 replies
5d10h

No need to be patronizing. Pointing out that an argument is not a good one doesn't necessarily mean one holds the opposite view.

Just because people once thought situation A was the case when it was actually situation B, and it now seems again that situation A is the case, doesn't mean it is necessarily situation B again, only that it might be. It might actually be situation A. But if it really is situation B, extrapolating from the past is not a good argument. That doesn't mean there are no other good arguments.

tomcam
0 replies
5d9h

I was patronizing in response to someone being patronizing to me.

No regrets at all.

jovial_cavalier
1 replies
5d2h

You seem really unpleasant. It's nice that you were successful and all, but I'm not sure why you need to rub it in the face of people who are still trying to be successful.

Things obviously change over time. It's not a non sequitur to claim that opportunities have shrunk over time. Bringing up counter-examples of opportunities that existed 40 years ago is not an argument.

I don't know why you think opportunity must be constant. I don't think you would have had much success with your .com startup mentality in 300BC.

tomcam
0 replies
4d18h

Can you tell me where I was trying to rub anything in anyone’s face other than responding to direct insults? The original point of my post was that you’re never too late, and there are tons of opportunities here in the USA. The last thing I want to do is rub anything in anyone’s face, especially considering financial success is highly relative, and has almost no bearing on who you are as a person.

> I don't think you would have had much success with your .com startup mentality in 300BC.

I disagree. I'm a survivor. I would have looked around and tried to figure out the best ways not to die and to keep my family safe using what resources I had. Most people are less paranoid than I am and are willing to just get along. My instinct is to get a little ahead so I have a buffer.

asynchronous
8 replies
5d12h

Older people cannot fathom this logic for some reason.

latency-guy2
3 replies
5d11h

Spell it out, what logic specifically?

jovial_cavalier
2 replies
5d2h

People say "There don't seem to be as many opportunities nowadays compared to my parents' generation..." and the response from some older quite successful person who has not had to hunt for opportunities in the current regime is always "You fool! There's opportunity everywhere! Just get a paper route and in 5 years you'll be able to buy a mansion!"

tomcam
0 replies
4d18h

Is it your thought that I called someone a fool?

stavros
0 replies
3d6h

I find it interesting how the older person who hasn't had to hunt for opportunities in the current regime is wrong, but it's taken for granted that the younger person who hasn't had to hunt for opportunities in the old regime is right.

I doubt that people 50 years ago were all rich.

jachee
2 replies
5d11h

Fine.

Give up.

Don’t chase a dream.

Just go find some drudgery for a paycheck. Don’t bother saving. You won’t need it.

You never had a shot anyway. Your ideas are stupid. Any time you think of a thing, someone else has thought of it first.

There’s no point in trying to make anything better. It’s all as good as it can possibly be.

Such a perfect world doesn’t need your contribution; can’t be changed by the likes of you; won’t care, even if you do manage to come up with something novel.

You’re too young to be taken seriously, and too late for everything, so why bother?

So get out there and fulfill your failure, and don’t worry about dreaming.

All the dreamers woke up ages ago.

szundi
0 replies
5d10h

Also even all the little things are already invented. Even locally. Everything, done.

mattgreenrocks
0 replies
5d3h

A pig. In a cage. On antibiotics.

tomcam
0 replies
5d10h

Might just be the other way around

grodriguez100
4 replies
5d10h

I think the point is that he thought it was too late in 1985, then people thought the same ten years later (1996), then the original article (from 2014) tries to explain how 2014 is not too late either (“the best time is now”).

In other words, it is always easy to think it is too late when looking back. But it is never too late if we try to look forward and imagine the things that have not been invented yet and which will be around in 10-20 years' time.

Gare
3 replies
5d10h

There will always be winners, but they're obvious only in hindsight.

tomcam
2 replies
5d9h

Nope. It was obvious to me then, though not my coworkers. Microsoft had already done creditably against Oracle and IBM in come-from-behind situations. The press decided it was time to put Microsoft in its place.

Marc Andreessen said glibly that Windows was nothing more than a poorly debugged set of device drivers. The technical press parroted it without any follow-through. When Microsoft came out with Internet Explorer and new versions of Excel, Word, and IIS, I knew they would win. Because device drivers are hard, nobody wrote them better, and Windows was good enough for millions of customers.

I think Microsofties low-key regarded me as a bit dull, because they believed the media had no agenda. I simply looked at the (public) accomplishments of my much younger coworkers and trusted my own observations.

ben_w
1 replies
5d8h

In the late 90s it was also "obvious" that Apple was about to go bankrupt. The entire dot-com bubble was people who thought it was "obvious" that the early internet was already a winner… and most of those companies failed. When that bubble popped, some people thought it was "obvious" that Amazon would be one of the losers in part thanks to their investment in the (failed) pets.com (not to be confused with the current owners of the same domain).

To me, it's been "obvious" for the last US$2 trillion of market capitalisation growth that Apple's going to face monopoly investigations; it's "obvious" that Facebook is, in a deep fundamental sense, not even capable of being compatible with GDPR; and it was "obvious" to me in 2009 that the first true self-driving car sold to the general public (by the quality standard "doesn't come with a steering wheel and that's fine") would be around 2019 or so.

There's been a few places I've worked with young energetic co-workers (and I have been a young and energetic coder myself), and… in many cases those groups failed to make enough money and disbanded.

One result of a lifetime of not being able to predict the markets is the realisation, a decade ago, that my preferences in many areas are almost anti-correlated with the modal buyer: I like what normal people dislike or even loathe, and dislike or even hate what they prefer.

bombcar
0 replies
5d1h

It was also "obvious" to many at the time that the huge names would continue to dominate; names that don't even really exist anymore.

Predictions of how it will be are always worth writing down, to see how correct they are.

Microsoft and Apple are huge today, but it's entirely conceivable that both are shells of themselves in a decade's time.

GuardianCaveman
1 replies
5d11h

The point was people thought it was too late then. And again at Microsoft ten years later. I thought it was convincing.

arcanemachiner
0 replies
5d11h

Don't worry, I thought it was.

ryanjshaw
0 replies
5d9h

I agree that this is a weak argument. I think a stronger argument is simply to say: every year we see people start independent, bootstrapped businesses. There's no reason to believe the potential here has changed. The only way to include yourself in that set is to try, and never give up.

EDIT: I will say, it may be a powerful story for anybody experiencing self doubt. Different people need different kinds of motivation.

mchinen
15 replies
5d8h

It's interesting to hear your experience at this point in time. Mine was later but noticeably different.

I started coding in the 90s as a teen releasing shareware, with my first gig at Adobe in 2001, where they paid me $14/hr as an intern. Even though these were rough times for tech, my CS peers at UW and I felt a lot of optimism and enthusiasm about what could be built, because the framework still supported idealists and we cared less about the economy (and we were definitely naive). Both the researchers and entrepreneurial types were very curious about inventing new paradigms.

When I talk to younger people now in CS, the enthusiasm seems to be split between 'pure researcher' types and 'entrepreneurial/builder' types, with the latter's interest concentrated on what is booming (like AI), and less about what can be built than about what will be able to raise large sums or attract more users. I'll caveat this with: I don't know to what extent the people I talk to have a bias, but I do wonder if there are fewer people willing to explore new frontiers now.

One major difference between now and then is the fraction of US market cap that is tech and, to a lesser extent, tech's importance in the economy. I wonder if this established leader position somehow could make people less optimistic and willing to explore?

tomcam
4 replies
5d7h

> When I talk to younger people now in CS, the enthusiasm seems to be split between 'pure researcher' types and 'entrepreneurial/builder' types

In my view the pure research types who land jobs have always been maybe 1%? UW is research heavy, don't forget. And at the time I was at Microsoft almost no one had plans to quit and start their own company. Before I started there in 1996 I figured about 80% would—an early indicator of how distorted my view was by press reports and my own biases. It seems to me that these days the vast majority are happy to be drones. Or am I way off?

In my time most of the employees were very smart students who just kind of fell into programming and were hooked in by Microsoft’s pioneering efforts at sending recruiters to colleges (rare at the time) and especially less-obvious choices such as Waterloo and Harvard.

What kind of shareware did you publish?

mchinen
3 replies
5d6h

It makes sense. I should say I'd estimate 70-90% of the class didn't want to work on their own ideas, but would be happy to work at a big company forever, and this was the mindset going into the CS program. A fair number of people within my circle of folks that hung out at the CS lab and lounge and discussed ideas eventually launched successful companies, and I'm probably overindexing on these types with survivorship plus hindsight bias, since they were the most memorable to talk to.

I interned at Microsoft in 2005, and it felt like most people there were just looking for something stable, with a side of Office Space-esque exit thoughts. The culture and scale were different in ways that seem weird to me today. One example is that Bill Gates invited all of the interns to have dinner at his house, which is something that is probably reserved only for startups nowadays, if that.

Re: shareware, I released a crappy checkers twist game for Mac, which was also very educational, and I was proud to have it on the MacAddict Magazine CD. [1] https://www.macintoshrepository.org/5196-checkerwarz

wcarss
2 replies
5d6h

By contrast, in 2009 the several hundred interns were bussed in a police-escorted motorcade along the closed-down highway from Redmond to the Pacific Science Center for a special showing of some Harry Potter film and a catered dinner, after which complimentary Xbox 360s were distributed. It felt like pretty big business.

bombcar
1 replies
5d1h

Everything has a cost, and if the hiring budget is available, things like bussing several hundred interns around may be cheaper than you'd expect, certainly cheaper (even with free Xboxes) than flying hundreds of candidates out for interviews.

Using "today" prices - something like $25-50 per person for the bus, $20 each for the film, $100 for the dinner, $400 for the Xbox - less than $1k per person all in.
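In rough code, the back-of-the-envelope math looks like this (a sketch only; the fly-out figures are my own illustrative assumptions, not from the thread):

    // Per-person cost of the intern event, using the estimates above.
    const internEvent = { bus: 50, film: 20, dinner: 100, xbox: 400 }; // upper-end guesses
    const eventTotal = Object.values(internEvent).reduce((sum, c) => sum + c, 0);
    console.log(`Intern event: ~$${eventTotal} per person`); // ~$570, well under $1k

    // Hypothetical fly-out interview for comparison (assumed figures).
    const flyOut = { airfare: 800, hotel: 250, meals: 100 };
    const flyOutTotal = Object.values(flyOut).reduce((sum, c) => sum + c, 0);
    console.log(`Fly-out interview: ~$${flyOutTotal} per candidate`); // ~$1,150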

TeMPOraL
0 replies
4d18h

Thanks for putting it like this, the numbers took me by surprise. During my university years (a bit over a decade ago), I actually applied for an internship at Microsoft, and they definitely spent more than $1k flying me from Poland to UK and back for an on-site interview. I wasn't that good, my university isn't that good either, surely they could've used that money to incentivize a few local candidates with free Xboxes and come out ahead. Makes me wonder why they bothered with interns from distant lands in the first place. Tech recruitment never made sense to me (though I enjoyed benefiting from its peculiarities).

actionfromafar
4 replies
5d6h

I think it is because the geeks "won". A lot of geeks found each other during the early web days. Nowadays it's not a stigma to be a geek. Everything is more mature and commercially integrated, including mopping up geek-ly minded people into commercial ventures. It's harder today to have an outsized impact from a bedroom and no budget. (Not in absolute numbers of how often this happens, but in relative terms! Many more people are at it nowadays, so some still slip through with a runaway success from nothing.)

marcosdumay
3 replies
5d3h

So, in other words, people today are late.

mattgreenrocks
1 replies
5d3h

Not at all. As someone who lived through that: there was much less enthusiasm and acceptance around starting a business. It was seen as harder and the Internet was perceived as less of a sure thing.

You could argue there was more low-hanging fruit in terms of opportunity, but anyone who was a product of the age wasn't primed to see it. So the absolute amount of opportunity wasn't substantially different than it was before. Our minds play tricks on us when we look into the past: "If I had known X, I could've done that." But you didn't. And business is rife with survivorship bias.

actionfromafar
0 replies
5d1h

Yes and no!

Today one is probably late to

"make a random open source project and see it become a pillar of industry",

- but today is the perfect time for

"start a software business and make great money"

latency-guy2
0 replies
4d13h

In terms of being too late to invent the wheel, sure, we can opine about not being that person all day and you'll always be correct. Unfortunately for you, there's more than that which makes up the world we live in today and tomorrow.

bobsmith432
3 replies
5d2h

I'm 16 and I want to grow up and revive the concept of a fixed-function graphics card to make graphics cards cheaper and more focused on gaming. I might even look into acquiring the 3dfx brand and reviving it with that goal. AI and crypto whatever might be the buzzwords but they don't interest me.

tavavex
2 replies
5d1h

GPUs are already relatively fixed-function, though - they're designed with only graphics goals in mind, which is why they're so efficient at them. The reason they were so suitable for AI was kind of a happy coincidence: the way they work as processors and their high memory bandwidth (both required for good graphics performance) were also perfect for AI.

taneq
1 replies
4d20h

“An array of programmable CPUs with instruction sets tailored towards vector operations” might be specialised but it isn’t fixed function. “Insert vertices, indices, and 2-4 textures. Receive image on screen.” DX7 style is fixed function.

tavavex
0 replies
4d20h

Hence "relatively". Specialized is the more succinct way of describing it though, yes.

DinaCoder99
0 replies
5d1h

> I wonder if this established leader position somehow could make people less optimistic and willing to explore?

The market seems like a bad fit for pro-humanity technology—researchers at least have that (albeit fading) potential in their back pocket.

michaelt
4 replies
5d8h

> A decade later in 1996 I started at Microsoft, which virtually all of my peers thought had run its course.

People thought Microsoft had run its course, one year after the release of Windows 95?

Just as MS Word was dominating WordPerfect? And the likes of AutoCAD were dropping support for AIX, Solaris, etc.? When Photoshop had just released for Windows, and Apple's market share had dropped to about 3%?

Wow! I know they were kinda late to online stuff, but still - strange how these things turn out, in hindsight!

throwaway828
1 replies
5d7h

There were rumours that the following year would be the year of Linux on the desktop. It had just taken, and was taking, the server world by storm.

Windows 2000 was really important for MS staying relevant in anything outside consumer. It really set up the next two decades.

tomcam
0 replies
5d1h

Next year is totally the year of Linux on the desktop!

tomcam
0 replies
5d7h

> People thought Microsoft had run its course, one year after the release of Windows 95?

Not sure if I made this clear, but I am referring to industry publications, and to what seemed to be a vast majority of my colleagues at Microsoft. Experienced business watchers who could read an SEC filing, and who had a mature grasp of business, knew better, but it wasn’t sexy to keep printing opinions like that.

I’m sure Department of Justice problems weren’t helping either. So these kids were exercising stock options like crazy because of the bad press.

These were people in their mid 20s who had entered these amazing jobs just out of college, and who had experienced several years of astonishing growth. They were both internally and externally very good at taking criticism. I think they just overcorrected at the first sign of trouble. I was a decade older, and had been studying business on my own for about 20 years by that point.

I had also worked for many companies that were nowhere near as well put together as Microsoft. Microsoft makes its share of mistakes, but never in ways that jeopardize the company’s existence. That may sound trivial, but if you’ve ever run a company, you’ll understand that it’s a surprisingly rare trait. Just managing not to fuck up too badly is exceedingly rare in the business world, and I think they didn’t quite understand that. Because they were open to criticism, they really couldn’t understand how special the company was at that time. Since they came in fresh from college, they had nothing to compare their jobs to. To them it was what any job should be.

bombcar
0 replies
5d1h

If you lived through it at a young age, you were on Slashdot or similar places, and Linux was riding high on the server (it was absolutely massacring Unix), and for kids at college, Linux + Samba was a quite powerful combination compared to even a few years before (think: Novell NetWare).

Few college kids have experience with really massive enterprise setups and solutions, and so those were just entirely discounted out of hand.

kubb
3 replies
5d10h

Damn, I better buy some tech stock if it’s gonna go up 13x in 4 years.

Actually if everyone just does that, nobody will need to work!

jodrellblank
2 replies
4d16h

NVidia's gone up 13x in the last 4 years.

Bitcoin's up 10x in the last 4 years.

Broadcom's up 6x in the last 4 years, 21x in 10 years.

Facebook/META's up 5x in the last 2 years, 20x in 11 years.

Uber's up 3.5x in 2 years.

Tesla was up 11x between 2019 and 2021.

Microsoft's up 10x in the last 9 years.

Amazon's up 11x in the last 10 years.

Apple's up 10x in the last 10 years.

Oracle's up 3x in the last 10 years.
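For perspective, a quick sketch (my own arithmetic, not from the comment) converting an "Nx in Y years" multiple into an annualized rate, since the same multiple means very different things over different holding periods:

    // Annualized growth rate implied by an "Nx in Y years" multiple: N^(1/Y) - 1.
    function annualized(multiple: number, years: number): number {
      return Math.pow(multiple, 1 / years) - 1;
    }

    console.log(`NVidia:    ${(annualized(13, 4) * 100).toFixed(0)}%/yr`); // ~90%/yr
    console.log(`Microsoft: ${(annualized(10, 9) * 100).toFixed(0)}%/yr`); // ~29%/yr
    console.log(`Oracle:    ${(annualized(3, 10) * 100).toFixed(0)}%/yr`); // ~12%/yr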

kubb
1 replies
4d10h

I’m buying some Google stock right now. I have a good feeling about it :) How do I pick the perfect moment to buy and sell?

stavros
0 replies
3d6h

Oh, that's simple, just predict the future with 100% accuracy.

begueradj
0 replies
5d6h

> I started programming in 1985 and thought I was very likely too late.

Lots of skilled programmers started coding when they were kids. But that does not mean starting to learn coding when you are 30 years old won't make you an excellent programmer: actually, you can learn even faster at that age, because concepts like logic, dividing an issue into several smaller ones, and iterating step by step toward the final solution are more intuitive to you than to a kid.

WD-42
32 replies
5d14h

Not to seem pessimistic, but in the 10 years since this article, what have we really gained on the internet that we didn't have then? Seems like we got a lot more social media and some failed promises from crypto. This is barring the current AI stuff, since it's still really shaking out.

muzani
9 replies
5d14h

There was the mobile and cloud boom, which resulted in more digital payments (more difficult for crime and corruption), online-to-offline stuff like ride sharing and e-commerce, plus a ton of advancements in logistics, especially in developing countries.

I think most of these changes didn't affect developed nations so much; it's probably still good old Walmart and Amazon. But they were life-changing to developing nations. We had some advancements in the rights of factory workers, as they had to match gig workers, and crime dropped drastically in some places because it just wasn't worth it anymore when you could climb out of poverty by delivering food.

JohnMakin
6 replies
5d14h

Mobile and cloud were well alive and booming a decade before 2014 (I guess you could very technically argue not cloud, but S3 buckets on AWS caught fire almost instantaneously).

dasil003
4 replies
5d13h

S3 didn't launch until 2006, iPhone 2007, App Store 2008. In 2004 we had Palms and VPSs, but that doesn't qualify as mobile and cloud in any modern sense of the word.

JohnMakin
3 replies
5d12h

So I hope you'll forgive me two years, then, on my “decade” statement, which was not meant as literally as you took it.

There were mobile phones before the iPhone. Like a whole ecosystem, even! They even had screens! And browsers! And internet!

The major innovation the iPhone brought was a touch screen (and a decent camera), which sort of existed before, but nothing was as smooth and crisp as the iPhone. Please, I really hope people out there don't think the iPhone was the invention of the cell phone.

dazzlefruit
1 replies
5d8h

What about the App Store? I never used a Symbian phone but I don’t remember a device that would have allowed as much as the iPhone in the way of custom apps.

muzani
0 replies
5d8h

I used the hell out of Symbian. There were stores, and telcos would sell apps too. Games were far better on Symbian because it wasn't a touch screen; it was practically a joystick. We had chat apps and dating apps. I was a thirsty college guy who spent a lot of data on these apps but eventually settled for SMS because data was too expensive. Web experiences sucked; my first job was converting websites to jQuery Mobile, which later became obsolete thanks to responsive websites.

But you can't build a business like Uber on Symbian. The iPhone brought about all these marketplace apps. They stopped being toys, and became more like mini computers.

toyg
0 replies
5d10h

Parent didn't say "mobile was invented", said "the mobile boom" - which indeed didn't really happen until (a few years) after the iPhone arrived. The amount of money flowing into mobile software is now an order of magnitude (or more) higher than it was in 2007. I'm your average Apple hater, but without them forcibly removing carriers from the software landscape, today we'd have a much smaller mobile market.

It's like differentiating between "the Internet boom" and the invention of networking - nobody is saying the 14.4k modem was the invention of networking, but there was no "internet boom" before it.

muzani
0 replies
5d8h

I graduated around 2012, Malaysia. A few things I distinctly remembered:

- some college girl doing CS was talking about how cloud is the future. I asked her what cloud was. She didn't know but mentioned the S3 stuff.

- my aunt, a clerk, was talking about how her job was training her to put documents in the cloud. She didn't know what a cloud was either.

- I got the worst career advice of my life. Someone said I should stop doing Android and learn to make BlackBerry apps instead, because that's what all the British people were using. On the other end, someone said I'd regret not making Windows Phone apps because they were superior in every way to Android.

- we were doing SVN to our own servers because this dude Joel put it as part of a checklist, and then we switched to Git because of the cloud thing.

- Uber (and similar) were just blasting off. In my country, it was MyTeksi (2011), later GrabTaxi (2013), then Grab (2016). They were the first "unicorn" in the region, where we hadn't experienced the dot com bubble, and regional VCs would pressure startups to raise and burn.

- Tech jobs were not really a thing here. Someone said that with my EE degree, I could get a real job as a KFC manager and keep all the IT stuff as a hobby. But in Malaysia, tech took off hard around 2014. SMEs became startups. Malaysian entrepreneurs from the US were coming back here, becoming angels and VCs because it was easier to invest here than in the US. And GrabTaxi started pushing a wave of hype.

It's probably different in different places. 2014 was probably a later stage in Silicon Valley, but it was quite early here in SE Asia. There are delays.

It's also why I'm optimistic on things like crypto, web3, AI, because the tech is at a mature enough stage to build things on, but it'll be years before that tech is built and many more years before it's a household product. The story really starts when the tech is used by a billion people.

ehnto
1 replies
5d8h

Not sure about e-commerce being a last-ten-years thing; if anything, we kind of lost a lot of e-commerce to the consolidation of online shopping into online marketplaces like Amazon, Etsy, Alibaba, etc. Also, online shopping software itself has consolidated up into the big tech companies, and small vendors are somewhat forced into the Amazon pseudo-monopoly.

muzani
0 replies
5d8h

I was both too early and too late for e-commerce. I started with it before payment gateways were really around. PayPal didn't like us because of high CC fraud. Many customers didn't even have credit/debit cards - they'd literally drive to the bank in the next town to find a cash deposit machine. Many countries didn't even have cash deposit machines. Logistics were too expensive and always breaking things.

But it's too late now because the conglomerates made it too cheap to compete with and the bar for affiliate programs is too high.

I'm not even sure if they're "tech companies" anymore. At least half their manpower is in blue collar jobs.

hot_cereal
4 replies
5d13h

Amongst the other examples, IoT. 10 years ago it was still in its infancy. As an example, Amazon didn't acquire Ring until 2018. Beyond just smart home stuff, payments like Apple Pay and Android Pay impact daily life. The EV boom is also a massive part of IoT. Every TV is now a smart TV (which is miserable, but that's a whole other discussion).

And an IoT world has nearly as many drawbacks as it does benefits, but I think it's hard to argue it hasn't changed the way we interact with the internet in our day to day lives.

amrocha
3 replies
5d11h

I don't think the things you're referring to have had nearly the impact you're claiming they had. They've just taken things we already had and made them slightly more convenient.

Ring is just doorbell cameras. Japan has had these for decades. Smart home is a complete failure; nobody uses Alexa or Siri or OK Google seriously. EVs are just cars but slightly different. Smart TVs just simplify the process of having to buy a Chromecast or plug a laptop into your TV.

TeMPOraL
2 replies
5d10h

Also the new things are objectively worse for the customers than the old things were. Ring's improvement over doorbell cameras is video quality. Smart TVs may simplify the process of having to buy and plug a Chromecast, but you buy and plug a Chromecast specifically to avoid all the bullshit and planned obsolescence that Smart TVs come with. Etc.

bombcar
1 replies
5d

If you take a true assessment of all these things, you find that for lots of people the Ring doorbell is just a doorbell now, the video part is ignored; the smart home stuff is just a light switch, and the smart TV is configured enough to get to YouTube or whatever.

People, especially tech-types, way over-estimate how much hassle we're willing to put up with day-to-day.

TeMPOraL
0 replies
4d19h

> People, especially tech-types, way over-estimate how much hassle we're willing to put up with day-to-day.

They also just as often underestimate it. It's part of why "data-driven development" fails. Regular, non-tech users have long been conditioned to assume computers and tech in general are buggy, finicky, and full of annoyances. They use them anyway, and bear the frustration silently[0], only occasionally begging a techie relative or friend to "fix my computer, it's slow now because it got viruses". Devs and PMs look at their telemetry, see users using a feature, and think they like it. They probably don't. They just suffer through it.

As a techie, I have very little tolerance for hassle. Which for IoT, ironically, means I'm running Home Assistant on a Raspberry Pi now, because it's a net save on annoyances - even though it's extra work, it lets me and my family use the "smart" parts of home appliances without frustration.

--

[0] - With who knows how much accumulating "death from thousand papercuts" psychological damage...

Swizec
3 replies
5d14h

> but in the 10 years since this article what have we really gained on the internet that we didn't have then

Figma comes to mind as an obvious standout example. We didn't even have the technology to support that level of multiplayer, computationally heavy UI in the browser back in 2014. No native apps had collaboration that smooth either.

Collaborative [text] document editing in general is a good example. So mundane these days in all the big web-based work apps that we don’t even notice it anymore.

smolder
0 replies
5d13h

> No native apps had collaboration that smooth either.

Games have generally been way ahead of business apps in this sense.

pjerem
0 replies
5d9h

Figma is a good product and being in the browser was a huge booster for its adoption but as a product, technically it could have existed 15 years ago as an installable program. With the same success? Probably not.

What made those products boom was cheap, high-quality video calls (so, bandwidth), which are totally necessary for real-time collaboration.

ndriscoll
0 replies
5d13h

I believe I recall Quake 3 running in the browser a few years before 2014. I imagine if we had the technology to run that, we had what was needed to render a WYSIWYG UI editor with multiple cursors.

Cerium
3 replies
5d13h

Zoom? Try having a video conference call 10 years ago. I remember going to an HP corporate office (in Taiwan, if I remember correctly) in 2012 where they proudly demonstrated a video conference room that worked well. Setting up calls using WebEx back then was slow, had poor performance, and usually somebody failed to join and had to call in.

forgotusername6
1 replies
5d10h

The main thing that happened here was bandwidth. Back then you'd be lucky if someone had the 2 Mbps needed for a reasonable HD call. The underlying technology has not really changed since then, though.

bombcar
0 replies
5d

And the commoditization - ten years ago was 2014, and things like GoToMeeting existed and were "barely capable", but could share screen, audio, and, if your connection was good enough, video from your grainy webcam.

But lots of people weren't using that; they were still stuck with telephony-based videoconferencing, with Polycom devices in the middle of a conference table and dual ISDN lines and stuff, stuff built out in the prior decade.

And then connections and cameras got good enough that Zoom could "just work" without all the hassle of following the rules, etc.

jmbwell
0 replies
5d4h

In about 2005, while I was IT for a startup, I worked with a counterpart at an investor (Deloitte maybe?) whose job was largely coordinating video calls in advance of meetings. We’d exchange IP addresses and SIP addresses and whatnot so that our Polycom could talk to their Cisco or whatever it all was. She and I would join 30 minutes beforehand to make sure it all worked, stand by during the meeting in case there was trouble, then come back after the execs were finished and shut it all down. The calls were at least as good as Zoom or Teams today, at the expense of two IT people and dedicated equipment. The execs never knew it not to work great. A much simpler time. We have come a long way.

bbor
2 replies
5d11h

Discord and Slack figured out messaging, and the web development tooling now is infinitely better than in 2014. On the front end, HTML5 and responsive web design were still new in 2013, and React came out in 2013, stabilized in 2015, and released Hooks in 2019.

On the backend, Next.js brought us SSR in 2016, MongoDB brought us document databases by winning “fastest growing database engine” in 2013 and 2014, and Docker brought us disposable containers in 2013.

The list is stacked towards older tech, but that's maybe because recent tech hasn't proven itself yet: Svelte in 2020 is still maturing AFAICT, and I've never heard of Vite (2021) or SolidJS (2022). I personally think many exciting non-AI trends are also ramping up now, such as local-first development (formalized in May 2023).

I think that the economy and innovation in general were curtailed due to the climaxing rampant corruption in the US, but the internet is something of an exception IMO.

This is all talking about developer abilities, of course: the constraints of corrupt capitalism mean that many of the actual sites released in the past decade have been terribly bloated and over-featured. But I think that's partially a consequence of businesses moving everything into mobile-first websites and apps—you're gonna see a lot more low-margin websites.

imacomputertoo
1 replies
5d1h

"...due to the climaxing rampant corruption in the US..." What?

"...constraints of corrupt capitalism" caused websites to be boated?

bbor
0 replies
4d20h

The first part was responding to the idea that the internet hasn’t really gotten better for the consumer in the last decade, despite technological advancements. And I see the reason for that being the regulatory capture of the American government, which allows a few massive conglomerates to monopolize and organize cartels in various internet-adjacent domains. Any reasonable version of America would have invoked antitrust a long, long time ago.

The second part is referencing the specific problems most HNers see with the modern internet: giant SPAs with huge packet sizes and questionable, non-accessible, hand-rolled functionality. And I think the reason for that is that our system is barely functioning right now: we made stock buybacks legal again for some reason, and executive pay and shareholder dividends were already absurdly high. Big SPAs that supposedly do everything, and the mobile apps that they mimic, have become a way to lower expenses rather than an additive new medium for business. When was the last time you talked to a business on the phone?

Long story short, capitalism makes bad websites by underpaying, underhiring, and overspeccing.

boppo1
1 replies
5d13h

I'm a painter pursuing traditional-style work. The education system absolutely failed me, and I have seen it fail countless others with the same desire.

The last 10 years have seen a renaissance of academic painting information and education, and social media, particularly Instagram, has been the fuel.

There is so much now that simply wasn't there then. My life would be so much different and happier if I were coming of age now with those desires instead of a decade ago. Nonetheless, it is still dramatically improved by the advancements I described.

I no longer feel so alone. I suspect many people with different niches are enjoying richer lives like I am, due to the last 10 years of internet.

alexyz12
0 replies
1d11h

thanks for sharing. I appreciate it

whycome
0 replies
5d14h

Tinder was technically launched in 2012, but the swiping thing came out in 2013. These tools are an essential part of connection for a large group of people. And you can't really handwave social media aside; it's been absurdly relevant since 2014. The Apple Watch launched in 2015 and has led a revolution in wearable internet-connected tech. 4K TVs and subsequently HQ streaming weren't a thing until after 2012.

tppiotrowski
0 replies
5d14h

WebGL2 and now WebGPU are allowing networked access to the GPU. That's new since 2014.

Edit: plus flexbox

faronel
0 replies
5d14h

I had a similar thought but challenged myself to think about the other side. Using this list of companies and thinking about those founded after 2012 (because it takes a couple of years to enter the mainstream), we can see that there's quite a bit of opportunity. https://en.wikipedia.org/wiki/List_of_largest_Internet_compa....

Social media for sure, but also the entirety of AR/VR. AI, and not just GPTs, but recommendation, detection, sentiment analysis, and data mining are all things that are 'new'. We can also think about things like online banking or healthcare apps that didn't exist. I was still sending checks in 2014, and I certainly wasn't doing video visits with my doctor. As someone who is middle-aged, when I ask younger people, they point out how much opportunity there is, as a counterpoint to the cynicism I'm seeing here on HN.

dartharva
0 replies
5d13h

India - and probably many other countries outside of the Americentric West zone - achieved complete adoption of digital (mobile) payments, shot up internet connectivity to the moon (to the point that it now has the largest population of netizens in the world), made huge strides in FinTech and Digital Learning, achieved complete adoption of digital commerce including app-based ride-hailing and food delivery, and saw the blossoming of a gigantic app/digital services-based economy.

Life has changed radically here as compared to 2014.

caseyross
17 replies
5d13h

While I agree with the spirit of the post, I think that there are better and worse times to start something new, and in retrospect 2014 seems like it was one of those worse times. The period from 2014 to 2024 was an era where the sheer gravity of the big tech platforms crushed innovative startups left and right. People with an extreme focus on product polish could succeed, like Slack (est. 2013) and Discord (est. 2015), but it feels like most of the tech-sphere was either just working on half-hearted products towards an inevitable acqui-hire, or fervently trying to mainstream blockchain in a wishful attempt to create an entirely separate, more open ecosystem.

manmal
11 replies
5d12h

You’re using past tense - has this period ended for you?

bbor
8 replies
5d11h

Well, a startup (ish…) is threatening Google, and VC money is much much tighter. So I’d say the Silicon Valley era is officially over. I hope y’all enjoyed it! I for one am a bit excited to see what’s the next big consumer idea to capture the zeitgeist.

tavavex
6 replies
5d9h

Wait, which startup could possibly be threatening the entirety of Google?

bbor
2 replies
5d7h

OpenAI. In my eyes, Google’s power is entirely in a) assets, b) an army of geniuses, and c) a brand that lets it maintain ubiquitous monopoly status over search ads. Google does a lot of other cute stuff, and they desperately want another revenue stream, but the simple truth is that the entire core of the company’s whole operation is search ads. Display ads are on the way out due to privacy concerns, and OpenAI has damaged both b and c.

sokoloff
1 replies
5d7h

I think that c might be inverted. It's the economic power/performance of their search ads that lets them maintain a brand, rather than the other way around.

bbor
0 replies
4d19h

To clarify: I think the reason Google controls search is that the public likes Google, so there's no political will to enforce the law. It's starting to crack now with the "Google search is bad now" narrative, along with the classic "Google is spying on me" -- together, they've emboldened Biden's justice department enough to bring multiple suits.

I think Google's attitude of free products (Gmail, Google Maps, Android, and Chrome especially) has won it insane amounts of goodwill, and that's all that's keeping them at the top. Having worked at Google, I find it incredibly hard to believe that they have any sort of fundamental technical advantage when it comes to building a search engine. I didn't work on search and obviously didn't look for the secret sauce, so that assessment is based on company culture and attitude alone.

dazzlefruit
0 replies
5d8h

Arguably, all you need to threaten is Search and ads. Remove those, and what happens to the rest?

aerzen
0 replies
5d8h

Maybe OpenAI?

Draiken
0 replies
5d5h

In my view: none.

Realistically, even if it started to happen, the giants would simply swallow it whole, as they've done very effectively in recent times.

Long gone are the times when a small Google could disrupt a field and upend everything around it.

OpenAI is half owned by Microsoft already. Without those resources they wouldn't be able to realistically compete against Google. So saying they are a "small startup" for me doesn't compute.

In my view this is a bit of wishful thinking. Today we have so many giants with monopolies that realistically competing in their space can only be done with huge amounts of investment - the one resource that has dried up recently.

ehnto
0 replies
5d8h

I am looking forward to software devs being affordable to industries outside of martech and the SaaS incumbents.

I think boutique software products built in-house for non-software businesses could provide immense automation and innovation to small/medium businesses. Not everything needs to be a SaaS; highly boutique software could cater to exactly one business and be a real holistic advantage.

marcosdumay
0 replies
5d3h

I'd say it's close to an end.

The large IT companies are acting in ways that are completely against innovation and self-improvement. I expect them to freeze up like the old giant corporations that they are.

But the change only started after the US decided to change its free-money-for-rich-people policy, and that was last year.

layer8
0 replies
5d1h

Everyone jumping on AI now is a good opportunity to do something different.

callamdelaney
1 replies
5d8h

The last thing Slack had was an extreme focus on polish. As a chat system, it's hardly functional, and far less so than IRC, which came before it. Slack managed to sell chat to big corporations; that was its innovation.

amelius
0 replies
5d8h

And with a little effort WhatsApp can still crush them.

wouldbecouldbe
0 replies
5d5h

Yeah, there are, but it's hard to know where in the cycle you are. So best just to have fun and create great stuff. If you focus on making people's lives better, you never really go wrong in the long term.

szundi
0 replies
5d10h

Also bad economic times are the best: validation is more meaningful and when things start to go again, you have a good cost structure.

beacon294
0 replies
4d23h

2013 was a great time to join the fray. I think the clarifying point is that you are working on the machine; you are not embodying the success or failure of the machines of that time.

boffinAudio
7 replies
5d8h

As someone who has used the Internet since 1985, I constantly find myself reminded of the fact that the Internet isn't just port 80. The Internet is so much more than just the web, and when someone comes up with a cross platform, powerful application which uses some other port and some other protocol, it will be just as functional on the Internet as any other Web Browser.

We could just as easily produce clients which exchange Lua bytecode. In fact, we do (games, etc.), but we could just as easily build an execution environment (which is what browsers are) that allows a much wider and broader range of application capabilities than the browser.
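To make that concrete, here's a minimal sketch of a client speaking a custom, non-HTTP protocol (the host, port, and one-line handshake are all made up); the Internet carries this exactly as it carries the web:

    // A toy client for a hypothetical line-based protocol on an arbitrary port.
    // Nothing here involves port 80 or a browser.
    import * as net from "node:net";

    const socket = net.createConnection({ host: "example.com", port: 9000 }, () => {
      socket.write("HELLO 1.0\n"); // made-up handshake for a made-up protocol
    });

    socket.on("data", (chunk) => {
      console.log("server said:", chunk.toString().trim());
      socket.end();
    });

    socket.on("error", (err) => console.error("connection failed:", err.message));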

This, then, is what I have in mind when I think "I've been on the Internet too long, I've become jaded": actually, the browser is just the lowest common denominator. As soon as some other client application is built which pushes user experience beyond what the browser is capable of, the Internet will still be there to host it.

And I find that inspiring, even after 40 years of crud.

TimTheTinker
6 replies
4d23h

> As soon as some other client application is built which pushes user experience beyond what the browser is capable of

The problem is that the browser is so good at being a ubiquitous, open-source, open-standards-based, cross-platform, near-instant-loading secure remote code execution environment. As Brendan Eich says, always bet on JS and WASM. I would extend that to -- always bet on browsers.

With the amount of change that has occurred in browser technology over the last 30 years, I strongly think the future looks like more improvement in browser technology (and perhaps deprecation/removal of old standards) than an entirely new paradigm. I know various early web pioneers want to build a post-Web application delivery system (notably Tim Berners-Lee and Douglas Crockford), but given the accumulated legacy of modern browsers and web apps and the ongoing investments into browser technology, I don't see how any post-Web system could gain any traction within the next 25 years.

But yes - your point still stands. If anything better than browsers ever appears, the internet will indeed still be there to host it.

BrendanEich
5 replies
4d12h

A couple of ways for a new "browser" to leapfrog:

1/ As guest in current browsers acting as hosts, for the zero-install ubiquity of the Web. This would require WebGPU, Wasm, fast JS<=>Wasm calls, and installable PWAs at least. Gatekeepers, notably Apple, can and will hinder. Once established, this new hosted app could be nativized to drop the browser dependency.

2/ As new obligate not facultative browser-like app in new device, e.g., XR eyewear. The device would have to take off at very large scale, and its maker(s) would want to do things not easily done via (1). The extant Web would be mapped onto in-world surfaces but the new device and its content stack would support things that go way beyond evolutionary improvements to the Web.

I used to think (2) might happen with Oculus being acquired by Meta, or by Apple eyewear, but I'm less optimistic now. The Big Tech players, especially Apple, can build nice h/w, but it won't get to scale quickly enough to subsume the Web.

TimTheTinker
4 replies
3d20h

> As guest in current browsers acting as hosts, for the zero-install ubiquity of the Web.

That sounds promising, except I don't see how the affordance of URLs could be duplicated in that context without introducing competing janky proxies, e.g. https://new-web.com/[insert-new-url-type-here]. Or were you thinking that installable PWAs would obviate that?

I wonder if a wasm app framework ("a new way to build web apps without HTML/CSS/JS") would be viable, as it could work seamlessly with existing URLs and people could host apps built in it anywhere.

If that gains traction, introduce a new, “cleaner” build target (sans HTML/CSS/JS linking) and a “faster, more lightweight” client for built apps. Have it use a new URL scheme to match - maybe wasm://. Web-targeted wasm app builds could even check if the URL scheme is available, and switch to it at runtime if it is, kind of like how web apps sometimes open the iOS app if it's installed; see the sketch below.
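A sketch of what that runtime check could look like, borrowing the timeout-based fallback trick mobile web apps use for deep links (wasm:// is speculative, and browsers expose no reliable way to detect custom-scheme handlers, so this is a heuristic at best):

    // Hypothetical: try handing off to a native wasm:// client, staying on the
    // web-targeted build if nothing picks up the navigation.
    function tryWasmHandoff(path: string): void {
      const timer = window.setTimeout(() => {
        if (!document.hidden) {
          // Still visible after ~1.5s: assume no wasm:// handler is installed.
          console.log("No wasm:// handler found; continuing with the web build.");
        }
      }, 1500);

      // If a handler exists, the OS backgrounds this page before the timer fires.
      window.addEventListener("pagehide", () => window.clearTimeout(timer), { once: true });
      window.location.href = `wasm://${path}`;
    }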

From there, expand into a platform for competing frameworks to target...

TimTheTinker
3 replies
3d18h

Endgame here is browsers run wasm:// applications (not a bad outcome), new client dies, and we still have web browsers.

But maybe we get a more lightweight Electron.

BrendanEich
1 replies
2d20h

At a high enough altitude it doesn't matter, but I was counting on PWAs. So Apple's bad moves in Europe matter.

BrendanEich
0 replies
2d20h

One observation re: Electron: it was a security nightmare (no sandboxing! then lots of script injection / prototype pollution attack surface), so I think the better path to a host runtime is an extended browser. It has a lot of hard-fought security and privacy defense built in.

silent_cal
6 replies
5d3h

An old coworker of mine had a friend who created an app for tugboat captains, and he's now pulling over $200k a year from it (probably more now). There are opportunities; they're just not very "cool". You have to look in unusual places.

ActionHank
5 replies
5d3h

Out of curiosity, what sort of functionality does a tugboat captain need that a regular boat/ship captain wouldn't?

silent_cal
4 replies
5d3h

I honestly have no idea, but I think his customers are mostly moving goods up and down the Mississippi River. So there's probably some specific needs there.

bombcar
1 replies
5d

There's tons of "small money" (read, too small for a VC, quite comfortable for a single developer or small company) to be made helping to automate small businesses, especially things that are built on a lot of independent contractors.

Some VC firm tried to do it for trucking and exploded recently, but there's still needs there that can be filled.

The key is to fill slowly and grow; for example (and completely guessing), you could have an app that starts out as just "tugboat parking availability" or whatever, and then you find what the users using it would also like to see.

silent_cal
0 replies
5d

Sounds like a nice strategy

ActionHank
1 replies
5d2h

Tinder for tugboat captains confirmed!

silent_cal
0 replies
5d

Lol

DoreenMichele
5 replies
5d13h

The best time to plant a tree is twenty years ago. The second best time is today.

I don't know why anyone would sit around moping about "If only this were thirty years ago!" If it seems like your idea would be "easy" or something -- if only it were thirty years ago -- then most likely it's because we are where we are and you know what you know now that you wouldn't have known then.

It's like all the people who say things like "Youth is wasted on the young" and "I wish I had known what I know now back when I was seventeen." Yeah, you know it at all because you aren't seventeen.

santoshalper
2 replies
5d13h

I do, because figuring out what is going to be big 30 years from now is hard (arguably impossible to do deterministically), but knowing what is big now is easy, and wishing you could go back and capitalize on that is natural.

lostemptations5
0 replies
5d13h

But not useful -- so that's why planting the "tree" now is more practical.

DoreenMichele
0 replies
5d13h

A lot of people also spend scads of time contemplating how they would spend the money if they won the lottery and it's generally a waste of time. They probably aren't going to win the lottery and it usually isn't used as a stepping stone to making real life decisions about, say, what kind of career they want to pursue or how they wish to spend their retirement.

I used to do that sort of thing and then began going "If I value x, why sit around telling myself I can't possibly have that or some piece of that without winning the lottery? Why not do what I can to actually work towards that to some degree, in some fashion?"

So you may get it. That's fine. I don't and that's not commentary on anyone but me and how my mind works.

AnimalMuppet
1 replies
5d2h

If this were thirty years ago instead of today, the opportunity to build X[1] would be wide open, but you'd have to build it with the tools that were available 30 years ago. That might not be nearly as easy as building it with today's tools.

[1] "X" meaning "variable to be substituted for" rather than "company that made a bizarre naming choice for no apparent reason".

bombcar
0 replies
5d

Also, you have clear hindsight: if I go back 30 years, I'm going to be developing an iPod or something, not any of the flops from 1994 (even though people might note that some 1994 flops had something quite similar become a big hit in 2004 ...).

It's like the people who talk about how obvious it was to invest in Apple or Microsoft or BRK or whatever "if they'd been there at that time" - and yet, by definition, the companies that will be referred to like that in 20 years' time almost certainly exist today. But how do you find them?

mysterydip
3 replies
5d14h

A great reminder, and interestingly prescient given the AI growth in the past couple years.

jauntywundrkind
2 replies
5d14h

Accurate I guess. Amazing given this was a decade ago!

Because here is the other thing the greybeards in 2044 will tell you: Can you imagine how awesome it would have been to be an entrepreneur in 2014? It was a wide-open frontier! You could pick almost any category X and add some AI to it, put it on the cloud. Few devices had more than one or two sensors in them, unlike the hundreds now. Expectations and barriers were low. It was easy to be the first.

Personally, the AI aspect feels like the least compelling way for computing to build in-roads & become genuinely compelling to the world again. Many many many people are on the sprinkle-on-some-AI-and-get-rich path, but to me it's just incredibly non-contributive to the field, to expanding the frontier. Applying deep-learned expert systems doesn't make computing better one iota, doesn't enhance man-machine symbiosis.

I crave a computing era where we can return to the fundamentals. To me, computing's 90's/aughts boom happened because we created democratized access to all the tech, because OSS helped us all learn and grow together. We never had better leverage to augment intellect (the Douglas Engelbart vision/mission) than that, and that made all the difference. It's hard for me to imagine owned, proprietary ML spawning that kind of lateral growth & understanding.

tomrod
1 replies
5d14h

I crave a computing era where we can return to the fundamentals.

The core of this comes down to -- will it scale?

This is what I want to see from the OS/OW AI models (open source / open weights) universe (kudos to Hugging Face for facilitating this).

I suspect the OpenAI model approach, where all infra is hidden and the models are hidden behind services, can't scale to meet the demand of what people want out of AI. Directions like Mamba over compute-hungry Transformers, SLMs over expensive LLMs, and so on, will commoditize AI services as operational frameworks mature. OpenAI gets that, which is why they are listening to consumer demand, but their moat assumption is faulty.

That said, I am optimistic about the future of this technology and others that are currently being mined for breakthroughs.

jauntywundrkind
0 replies
5d13h

I can see better now that this is a distinct take, but to me the fundamentals were about man-machine symbiosis: about making systems that we understand & that make us freer for using them.

I do think scale is important too. But it feels like a refinement, one we've already covered pretty well, albeit one we have to keep re-learning & that isn't evenly distributed yet. But making computing that is open & explorable? Go watch an Alan Kay talk, any of them; he'll say what I'm saying: we have barely begun to figure out the actual essence of computing, where it opens itself up to us & makes us more potent people. We have been grinding on the same target, without pause for thinking about what general computing systems could be like, and it's the wider view - what this is all actually in service of - that will help us.

AI to me is mostly creating our own Dark Forest scenario on our own planet that we have to somehow coexist with. More distraction from fundamentals.

dougmwne
3 replies
5d13h

I feel like the last 10 years have been a maturing phase. There weren’t a lot of opportunities for young upstarts without funding. It seemed like it was the era of big money innovation, burning mountains of cash trying to stake the last few open claims on the app and web ecosystem. And there was the crypto stuff, yuck, many went to prison.

But the real revolution is AI. Thinking back to 2014 and peering forward, it's unfathomable. If there had been a sci-fi movie, I would have thought it unrealistic. I still think I have no idea where this will take us or how much our industry will change. What a great time to be an entrepreneur.

ildjarn
2 replies
5d10h

Her came out in 2013, so there kind of was a movie.

dougmwne
1 replies
5d2h

Oh that's true. She was a most excellent chatbot. But I don't recall her making images or movies, or composing music.

I recall that movie seemed cute but probably technologically impossible. Now it’s reality for many users of AI girlfriends.

i_am_a_peasant
2 replies
5d7h

I'm pretty effective at my job (embedded software) and I have a decade of pretty solid experience writing device drivers and debugging some really tough issues. But I always feel like I'm not at all ready for the caliber expected on the US job market. I joined a startup so that I could be pushed harder and to prove to myself I can thrive in an environment where it is expected that you will put in 300% effort.

Who knows maybe I'll give it a go in a couple of years.

Draiken
1 replies
5d5h

You will likely learn this yourself, but having made this transition in the past, I can tell you the image I had of this world was simply not real.

Startups fake shit all the time, cut corners everywhere (even when they shouldn't) and the hustle culture exists primarily to benefit from employee overwork (free labor).

Coming from a developing country I always thought I would never be as good as "first-world developers". The reality is that the bell curve applies the same way everywhere. Most people aren't exceptional and if you're exceptional in your country, you're most likely exceptional everywhere.

If I were to give my past self one piece of advice, it would be: you're more than good enough.

Of course, YMMV. Good luck either way mate!

i_am_a_peasant
0 replies
5d1h

Thanks brother. Big love!

The thing about hustling is to only do it for yourself, not others.

downWidOutaFite
2 replies
5d13h

This would be a lot more true if we could break up the trillion-dollar tech monopolies. And we could, if we mustered the political will. Step one is to stop idolizing the crony VCs and billionaires.

iraqmtpizza
1 replies
5d10h

or we could just break up companies involved in anti-competitive practices regardless of their size. you know, rule of law and stuff

fsflover
0 replies
5d4h

We should start with the large companies anyway. They do the most harm.

martynr
1 replies
5d11h

Unlikely to be a popular question in this forum, but in my view the most important topic we need to reflect on is: "are we really better off with these services?" Obviously at face value there is utility or there wouldn't be the market penetration, but the externalities and issues of market dominance are becoming more apparent and pressing.

And if we agree that better solutions are needed then the question becomes “how do we create the market conditions that support those better outcomes?”

I'd really like to see KK reflect on this article with 10 years' hindsight and see if he remains optimistic.

marapuru
0 replies
5d3h

I was thinking something similar. Yes, a lot has happened. But which metric can we use to determine whether this was a successful decade?

jonplackett
1 replies
5d8h

I like this.

It reminds me of an anecdote Tony Robbins tells about a gold miner who just says, "The thing people don't realise is there's gold EVERYWHERE"

defrost
0 replies
5d8h

The reality is that many places have so little gold that the cost of extraction far exceeds the value of the gold.

You still have to watch your break even.

coffeebeqn
1 replies
5d14h

They already figured out bronze, which you can build anything you can imagine out of. Why would I put any more effort into metallurgy?

amelius
0 replies
5d8h

The other day I was trying to forge a lock, but then the guy who gave me the iron ore showed up, asking for 30% of my revenue ...

MichaelRo
1 replies
5d8h

"Opportunities still exist" is not at odds with "the low hanging fruit has been picked".

At the moment both statements above are true, meaning we (the little guy) are on average fucked. Sugar-coating it with platitudes and cheap aphorisms doesn't change the reality that opportunities have dried up for all but the wealthy and connected, and even for them it's not easy.

AnimalMuppet
0 replies
5d1h

The low-hanging fruit has been picked. But part of that fruit was producing better tools. So we now have taller ladders. As a result, there is new fruit that now is low-hanging.

JohnMakin
1 replies
5d14h

This fun thought only holds true if the internet is not in decline. In 2014, which very arguably was peak internet, I could understand not thinking that, but today I struggle to imagine how this isn't the case.

datadrivenangel
0 replies
5d13h

Peak internet was 2015, before the 2016 internet rupture, when the digital space became pop culture and cyberspace inverted into the real.

DrNosferatu
1 replies
5d3h

The same freedom should apply to LLMs!

Especially because, initially, the internet only provided content free of charge - now we actually pay subscriptions to use the top LLMs.

adtac
0 replies
5d2h

The opposite is true too. Things that used to be expensive are abundant.

An encyclopaedia would have cost you thousands of dollars, but you can use the greatest and most up-to-date compilation of human knowledge for free today (if you can deal with a few donation banners from the Wikimedia Foundation around December every year).

warthog
0 replies
5d4h

I think another consideration here is not the internet itself but its distribution. The internet economy is no longer an anarcho-capitalist environment where any opportunity is wide open; it is restricted to big circles and ecosystems controlled by Big Tech.

Take mobile and app stores. There are teams and people who have built many innovative, better solutions for dealing with apps and app stores, yet they are simply not allowed.

AI does not change this norm. VR and AR perhaps, but they are also dominated by the same companies. Web3 was a good bet to change it, but it seems like it is not happening, at least not the way we thought it would.

I am still optimistic about the future but possibilities are realistically rarer.

Ref: Yanis Varoufakis also talks about this in the context of a digital feudal society.

tavavex
0 replies
5d9h

I'm young, and talking about years like 1985 feels like talking about some other reality, but to me it still feels like the two years aren't comparable.

Whenever I read about the history of computers and software in the 80s, it feels like there are always mentions of relatively new companies forging their path, new hardware manufacturers and software developers shaping the brand-new home computer industry. Sure, there were some old giants on the playing field, but it almost sounds like anyone with a decent idea could carve out a space for themselves.

2014 though? There were a bunch of opportunities back then, obviously, but it was already deep into the megacorp consolidation era, when large software or web companies owned dozens of unrelated services and had secured an almost endless influx of cash. There were startups, and a few of them became extremely successful, but it feels like most of them were either doomed to fail or to be bought out by a larger company. I feel like in this last decade, internet innovation has mostly slowed - the internet of 2004 was extremely different from the internet of 2014, yet 10 more years have passed since then and it doesn't feel like as much has changed.

Maybe it's just my imaginary rose-tinted view of the past or something, but it feels like it's harder than ever for a startup to compete with the big players now. The only big exception I can think of is video games - self-publishing anything was probably almost impossible in the 80s, but nowadays we have an endless sea of development and publishing tools for independent developers with fresh new ideas.

Perhaps there's a completely new field on the horizon that will level the playing field once more, putting everyone back at square one. I think that some industries could get squeezed dry until the next big innovation comes along, or people move on to some other new space.

murbard2
0 replies
5d7h

Well sure, in 2014 you were not too late, but now...

jauntywundrkind
0 replies
5d14h

I have hope for many more turns / revolutions!

Boy have we been in a holding/consolidation pattern. The age of massification has been upon us; getting everyone else online has been the effort, the way to rise. Free online services we can connect to from anywhere have been an amazing change, a totally new expectation that changes how computing situates itself in our lives.

At much cost to figuring out further places to pioneer, I feel. We need new cycles with new energy, where lots of people are trying stuff out again. Computing & tech should not stay well settled; we should be trying crazy stuff. It feels like a lifetime ago that Tim O'Reilly was preaching "follow the alpha geeks": look for those very capable folks doing excellently for themselves. That ability to trendspot & try lots of things has been somewhat crowded out by these huge scales & systems, but I believe a return to personal-ity has to crop up again sometime. We'll find some fertile terrains where new things are happening again.

2014 was when things were actually settling into place, when the pioneering stage was really giving way to some settlers (and town planners, see: https://blog.gardeviance.org/2015/03/on-pioneers-settlers-to...). There's a lot of roughness, but tech getting its mojo back may be not far off. It's a challenge though; the reality of creating ever-connected, ever-online systems is hard. We have amazing amounts of server power we can fit in a pizza box, at incredibly low $/performance, but we're unpracticed at doing it without pain at smaller scales, by ourselves, anew. Trying to create new generative platforms that serve as a basis to let more value spring up needs a strong foundation, so they can keep "creating more value than they capture," another O'Reilly-ism.

The future is (with hope) exciting!

hasoleju
0 replies
5d9h

Most of the things that made up the internet in 2014 were created in the second half of the internet's lifespan up to that point. Assuming that this trend continues, most of the things people will use in 2044 still have to be invented. You don't know what the new things will be. But looking back, we notice that a lot of things came into our lives recently.

We are so used to these big innovations that it feels like they have always been there.

ffitch
0 replies
5d13h

As a side note, I find it curious how back then technology innovations were largely ignored for years after they matured (over a decade for companies to start claiming domain names), and today technology is hyped even before it becomes practical (blockchain, LLMs)

emrah
0 replies
5d

I had this revelation a while back as well. Imagine all the people being born today; they have some growing up to do. By the time they are ready to go out and do stuff, even more time will pass.

There is certainly nothing they can do about the timing of their birth and they will have to do the best they can.

You can imagine being born today, and your situation is even better than that, because you are ready to take action already, right now!

d4n0ct
0 replies
3d19h

A few general observations: while there are still opportunities, perhaps fewer in developed countries and more in developing ones, the QUALITY of opportunities was different back in the day.

The SOFTWARE field was still relatively wide open. The barrier to entry was being nerdy smart, but otherwise the costs (basic computers, telephone line access, some education) were low, at least in parts of the USA. Simultaneously, the potential return was high, because the information age was just beginning and the world was hungry for software. People who were smart with software at exactly the right time could literally shape the coming world. Think Microsoft and Google.

In comparison, while the hardware side (chipsets, telecom equipment, etc.) has grown, it hasn't "exploded", except maybe CPUs or GPUs, which are hugely capital intensive. You cannot yet train AI in your garage in a way that differentiates your "product" from those of the big players. The demand for pure software is no longer that high. And barriers to "physical" innovations, such as semiconductors and medicine, are as high now as ever.

The "low"-hanging "soft" fruits have been picked. This now feels more like the big mainframe computer era. So while there are still opportunities, they aren't comparable to the past. When science and technology fundamentally changed in the past, from manual labor to machinery to electronics and so on, they opened up new fields for human innovation and endeavor. Groundbreaking disruptions are much rarer now, and the only current one, AI, opens up a new field more for computers than for people.

anymouse123456
0 replies
5d5h

These conversations about "way back when", attempting to compare tech opportunities from one time period to another, always seem to focus on the wild opportunities and forget about the equally insane constraints.

Please don't forget that in the late nineties, CPUs were measured in a few hundred MHz, RAM in tens of MBs, and network speeds were kilobytes per second.

FOSS was a new idea and most software cost hundreds or thousands of dollars. People PAID for compilers! Workstations cost many thousands and servers cost millions. These investments would be literally wiped out every year or two by the leading edge of Moore's law.

In the late nineties, I worked for a company that charged $3 million to a big brand to build a website with user forums. It took a year and more than 30 people to launch. We had to buy ~$1m worth of server hardware and fasten it into racks.

Every time constraints change, new opportunities emerge that either weren't possible or just didn't make sense before.

If you're looking for opportunities, keep an eye on observable, recent changes, especially those that haven't yet been broadly distributed.

Look for changes in constraints that are limiting large amounts of valuable activity.

Some random examples in the last 5-10 years (IMO) are:

- LLMs, obviously

- Shared, cloud infrastructure made it trivially inexpensive to launch new business ideas

- High speed computation has fallen in price so much that most applications don't even need cloud infrastructure anymore (IMO)

- The amount of trust many people have in big tech companies is at a nadir; this could be a great time to carve out a beachhead in anything they do. Being perceived as trustworthy and NOT THEM could be enough to build a great business.

- Many people seem to feel that social media is net bad for us; figure out how to solve the same problem set (feeling engaged and connected?) with fewer negative trade-offs.

- The electronics hobby market has exploded and made new hardware ideas possible, even if budget is limited.

There are a bunch of observations like these that you're uniquely positioned to make and dig deep on.

Most of them aren't viable, but the fun is in probing around at the edges of what's possible and learning. That's the thing that brings us closer to whatever will work.

Maybe it's just me, but I've noticed that when I'm waking up frustrated and unhappy most days, it has almost always been because I'm ignoring my conscience, which has been trying to tell me that I'm not pointed in the right direction.

aizyuval
0 replies
5d8h

History shows that there's no such thing as late. As long as we are human, there are always vacuums to fill if people take ownership of them. Rebels, inventors, thinkers, hackers: they all emerged in different times by taking ownership.

It is only the perception of this innovation that could change as time goes on.