
Why Cities: Skylines 2 performs poorly

xyzzy_plugh
71 replies
22h38m

It's kind of stunning that a game of this magnitude is able to go out the door without model LOD.

I suppose the fact that it runs at all is stunning -- surely you could not get away with this a decade or two ago -- but perhaps it speaks to the incredible capabilities of modern hardware. This feels a bit similar to the Electron criticism, where convenience ultimately trumps performance, and users ultimately don't care. I wonder how this will play out in the long run.

Bizarre and, at least for me, equally sad. I long for the days of a tuned, polished game engine squeezing every ounce of performance out of your PC.

hypeatei
36 replies
22h31m

Valve/Steam should really have some policies around unfinished or unpolished games so that they are forced to be marked as "early access"

It is absolutely ridiculous that these developers can get away with releasing a beta (essentially what it is) and setting the full release price without the end user knowing they're a guinea pig.

solardev
8 replies
22h26m

I think their reviews are punishment enough. "Very Positive" for CS:1 and "Mixed" for CS:2. And if it improves over time, the reviews improve with them!

Cyberpunk was a good example of that. And the graphs make it really easy to see how it's changed over time: https://store.steampowered.com/app/1091500/Cyberpunk_2077/#a...

wincy
3 replies
21h14m

Okay for real, who decided this game could have the acronym CS? Counterstrike has been one of the most played games since 2000. I don't even play Counterstrike but it has a huge player base compared to this game. And didn't Counterstrike 2 literally come out a few weeks ago?

solardev
0 replies
21h9m

Heh, good point.

Also, I really wish Apple chose some other name for its Game Porting Toolkit... hard to find relevant discussions in the sea of "other" GPT talk.

FrostKiwi
0 replies
16h12m

The acronyms are not the same, and the article writes them properly.

CS2 -> Counter Strike 2
C:S2 -> Cities: Skylines 2

Dah00n
0 replies
18h29m

To make matters worse, when it had just released, the two top games on Steam were CS:2 and CS:2.

dvaletin
3 replies
22h20m

Which only incentivizes companies to publish unfinished products with a "will fix it later" ideology.

solardev
2 replies
22h16m

Is that a big deal? That's easy to avoid if you don't pre-order games and just wait for the day 1 reviews. Even if you did end up with a shitty situation, Steam lets you refund the games with minimal hassle.

On the other hand, there are players who'd rather have the game earlier (like me) than a few months later, despite its launch issues.

The alternative approach -- Baldur's Gate 3 being in Early Access forever -- is fine too, but damned if that wasn't a long wait.

Maybe the compromise is bigger companies being willing to release in Early Access more often. That shouldn't be limited to just indie companies, but any publisher that wants early and broad public feedback.

Especially for a city-builder game (where there isn't really a campaign or spoilers), I don't see why not...

johnnyanmac
1 replies
7h18m

large companies don't have much incentive to use EA. The usual incentive for EA is to make a game cheaper for early users and to get important playtesting done. A big game with a big ad budget, QA testing studio, and internal playtesting doesn't need to worry about that.

solardev
0 replies
4h40m

Well, in this case, they were a 30-dev company that knew the performance was going to be so bad that they had to release a warning a week before the game came out. Seems like Early Access may have been appropriate, and maybe saves the review score too.

worldsayshi
7 replies
22h9m

I wonder if releasing a widely anticipated game unfinished is sometimes actually strategically beneficial marketing wise. Perhaps it's a marketing dark pattern?

It makes the game stay in people's minds longer because people keep coming back to it asking "is it good yet, have they fixed it yet?". It kind of feels like it has worked that way for Cyberpunk. If it's a finished game on launch day, people will quickly make up their minds whether it's for them and then move on.

Personally I would be on the fence about buying it even if it was good on launch and I would probably not buy it straight away. But I might just change my mind if I get reminded of it enough times. Then again I felt like that about Cyberpunk as well and I still haven't bought it.

davedx
3 replies
21h59m

But it’s not “half finished”!

It has some performance issues. Not the same thing.

worldsayshi
2 replies
21h44m

Sure, I should've picked a better word there.

ryandrake
1 replies
20h45m

I think half-finished is a good way to describe the state of day-1 releases of games these days. Look back on other games (and non-game software) and measure A. the amount of time between when the developer started and the first release, and then B. the total amount of time it took to get to the final patch. I bet for many, MANY games, A ≤ B/2: They were literally "half-finished" in terms of time, on first release.
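That "half-finished in terms of time" check is easy to sketch; the dates below are hypothetical, just to show the arithmetic:

```python
from datetime import date

def finished_fraction(dev_start: date, first_release: date, final_patch: date) -> float:
    """Fraction of the total dev-plus-support timeline elapsed at first release."""
    total_days = (final_patch - dev_start).days
    days_at_release = (first_release - dev_start).days
    return days_at_release / total_days

# Hypothetical game: 3 years to launch, then 3 more years of patches.
frac = finished_fraction(date(2018, 1, 1), date(2021, 1, 1), date(2024, 1, 1))
print(f"{frac:.0%}")  # 50% -- "half-finished" at first release, by this measure
```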

johnnyanmac
0 replies
7h34m

Sure, but that's not how consumers measure "finished". By that logic, Binding of Isaac wasn't even in Alpha state when it launched in 2011, since it got its latest update this year. Meanwhile, Sonic 2006 launched once, never got updates, and was buried for 15 years before Sega loosened up. It was by all accounts "finished", in the worst way.

unpolished =/= unfinished.

solardev
2 replies
20h31m

I think Cyberpunk was only able to turn itself around because the studio got so famous with The Witcher and people were willing to give them another chance. If they hadn't been famous already, it'd just have been another rando shitty game on Steam, of which there are thousands...

But then there are stories like No Man's Sky too, which had a miraculous turnaround as well. So maybe it can happen sometimes...

bsder
0 replies
19h17m

The anime Cyberpunk: Edgerunners had a significant impact on getting people to look at the game again.

That's a black swan that can't easily be replicated.

Dah00n
0 replies
18h22m

I have a more conspiratorial opinion of Cyberpunk's criticism. If you look at Cities: Skylines 2 and many other AAA titles, they are just as bad or worse. Yet the one company that sought to fight the way other developers and publishers behaved got a shitload of criticism for what was - in my opinion - a much smaller problem. CS:2 is much worse, Starfield is way more buggy, etc.

In my opinion, Cyberpunk was punked by the gaming industry to harm CD Projekt Red. I have no proof at all, of course.

jrajav
5 replies
22h24m

Read the reviews and don't buy it. This works fantastically as a punishment already without ham-handed, opaque moderation.

hypeatei
2 replies
22h20m

Early access is not a "punishment" though. It's a system that already exists on Steam.

joe_guy
1 replies
22h14m

But when used in the way you're describing - Valve forcing it on a developer instead of the developer opting in - it becomes a form of punishment.

Dalewyn
0 replies
21h12m

Early Access is a "punishment" purely because game devs and publishers use it as an excuse to sell incomplete, broken products.

The terrible reputation is self-inflicted and deserved.

Dah00n
1 replies
18h31m

If this works fantastically, then why is it on the top sellers list?

https://store.steampowered.com/search/?supportedlang=english...

FranksTV
0 replies
16h13m

Because the problems are vastly overstated. I've been playing fine since launch without any issues.

athorax
4 replies
22h29m

Valve/steam should absolutely not be doing that

rychco
2 replies
22h24m

Why not? There's already a hardware survey & they could easily have an opt-in system that reports the user's average framerate while playing games. If the average hardware specs can't run that game at >=60fps >=90% of the time on any graphical setting, then it's beyond fair to give it a "Hardware reports indicate that this game performs poorly" label.
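A minimal sketch of what that check could look like, assuming the client reports per-frame times. The 60fps/90% thresholds come from the comment; the function name and sample data are made up:

```python
def performs_poorly(frame_times_ms, fps_floor=60.0, required_share=0.90):
    """True if fewer than `required_share` of frames hit `fps_floor`."""
    budget_ms = 1000.0 / fps_floor          # ~16.67ms per frame at 60fps
    fast = sum(1 for t in frame_times_ms if t <= budget_ms)
    return fast / len(frame_times_ms) < required_share

# 80% of frames at 16ms (62fps), 20% at 40ms (25fps): fails the 90% bar.
samples = [16.0] * 80 + [40.0] * 20
print(performs_poorly(samples))  # True
```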

solardev
0 replies
22h11m

I like that!

dgunay
0 replies
18h7m

It's high time telemetry served users tbh. I would 100% opt-in to telemetry if it would tattle on badly made or misbehaving software.

hypeatei
0 replies
22h27m

Why not? They're still able to list the game and sell it.

I don't see the issue with making it more clear to end users that they're beta testing a game.

davedx
3 replies
22h0m

This old chestnut again.

Software is not “essentially a beta” because it doesn’t meet a bunch of entitled users’ arbitrary definitions of “finished”.

A game having some performance issues doesn’t mean you’re a beta tester.

Did you even buy this game? I suspect not.

hypeatei
2 replies
19h40m

No, I didn't buy it because it's $50 and you have to follow guides and tricks to get it running optimally. That's not what I expect from a game released at full price.

FranksTV
1 replies
16h13m

You have a lot of opinions about a game you've never played.

hypeatei
0 replies
15h57m

Enlighten me.

chc
1 replies
22h29m

They let you return the game no questions asked if you haven't played it for two hours.

hypeatei
0 replies
22h24m

True. Marking it early access would just save more people's time and be more explicit about the current state of the game.

geraldhh
0 replies
21h57m

market forces ...

but yea, source2 engine could have used some more love before going live

MagicMoonlight
0 replies
22h7m

Yeah they need to start moderating quality.

Pannoniae
15 replies
22h36m

Don't forget the part where they use web tech and waste draw calls like crazy on the UI. These things should literally be banned.

edit: not that web tech should be banned, but releasing a game with horrible optimisation like this, either by the store selling the game or by the law

capableweb
12 replies
22h29m

Where are you getting this from? I'm literally sitting with the game open right now with the Chrome Devtools connected to it, and I'm seeing no unnecessary modifications on the DOM side of things.

Could be that they integrated the Gameface library incorrectly, I guess? Still interested in more details from you.

Pannoniae
8 replies
22h23m

Directly from the article:

"The last remaining draw calls are used to render all of the different UI elements, both the ones that are drawn into the world as well as the more traditional UI elements like the bottom bar and other controls. Quite a lot of draw calls are used for the Gameface-powered UI elements, though ultimately these calls are very fast compared to the rest of the rendering process. "

With this minimalistic, flat-style soulless UI, the correct number of draw calls spent on UI should be single digits....

capableweb
6 replies
22h19m

Not sure what kind of projects you've worked on before, but in the ones I've been involved in, you wouldn't spend time optimizing something taking <1% of render time when other parts are heavily affecting the final render time for each frame.

Why on earth would they try to optimize how the UI renders when they're having big issues elsewhere?

Pannoniae
5 replies
22h9m

Sorry, my initial comment probably came off quite differently. It's not that "using web UI" is the reason why this game has awful performance, that's more like a bellwether for the studio's priorities.

It's absolutely not the most important problem here, or even in the top 10, but it illustrates really well how much they care about making a game which performs in an acceptable way. (which is: not that much)

Also, it's not even one or two high-poly models dragging the performance down, what I was aiming at is that the game suffers from death by a thousand cuts - the LoDs are only a part of the issue, almost every part of the game is done in a sub-optimal way. So while the UI is not a significant part of the frame time, if they fix the most glaring performance issues, they will find that there won't be a silver bullet, the game is just a pile of small performance problems all the way down.

jsnell
2 replies
21h45m

> It's not that "using web UI" is the reason why this game has awful performance, that's more like a bellwether for the studio's priorities.

All it shows is that optimizing something that was already fast enough was not a priority. But why would you want it to be?

kbelder
1 replies
10h52m

Sometimes you can take those small design decisions as indicative of how decisions are made in a company.

I remember back when Apple removed the SD card slot from their phone. It wasn't a deal-breaking change, because their phones had lots of memory built in... but at the time, it seemed to me that the decision makers that did that, would continue to make similar decisions, each one making me unhappier and unhappier. So I bought accordingly. And I feel justified in retrospect.

Way back in the design stages of this game, somebody thought that using web tech for their UI was a good idea, and it didn't get vetoed. That indicates bad judgement somewhere in the chain. Who knows whether it's developers, managers, or what... but the best predictor of bad decisions is previous bad decisions. It's at least a dysfunction smell.

jsnell
0 replies
6h41m

But you're just asserting that it's showing bad judgment! Why is it a bad idea? Certainly not due to performance. Nor due to productivity.

ripper1138
0 replies
20h35m

Respectfully, just take the L on your original comment and move on. It’s ok to be wrong.

DonHopkins
0 replies
19h16m

>"These things should literally be banned."

Not "metaphorically", but "literally"? Or are you using "literally" in its non-literal sense? And "banned", not "discouraged", or simply "ridiculed" like you're trying to do?

That is literally (to use the term in its literal sense) an extremely brash statement, quite a lot to walk back in reverse into the shrubberies. Have you actually tried to develop a UI in Unity that approaches the quality you can easily (and cheaply and quickly and maintainably) implement in a web browser? And have you ever tried to find someone to hire who was qualified to do that (and then put them to work on the UI instead of the game itself), compared to trying to find someone to hire who can whip out a high quality performant web user interface in a snap, that you can also use on your web site?

Not to mention that you used web tech to call for the literal banning of web tech.

alright2565
0 replies
22h16m

The author calls out render passes that take 100us, and considers this pass too fast to give a number to.

Why does it matter if it's 5 render calls or 500? The developers clearly have plenty of work to do optimizing the other 70ms, it doesn't make sense for them to spend any time working on this.

rstat1
2 replies
22h16m

"Quite a lot of draw calls are used for the Gameface-powered UI elements, though ultimately these calls are very fast compared to the rest of the rendering process."

Literally quoted from the article. Standard 2D UI like that can be done in as little as a single draw call (or so I have read, never actually done it)
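The single-draw-call claim rests on batching: append every widget's quad into one vertex buffer (all widgets sharing a texture atlas) and draw the whole thing at once. A toy CPU-side sketch of the idea, with made-up element data and no real GPU API:

```python
def quad(x, y, w, h, u, v):
    # Four vertices per UI rectangle: screen position plus atlas UV.
    return [(x, y, u, v), (x + w, y, u, v),
            (x + w, y + h, u, v), (x, y + h, u, v)]

def build_ui_batch(elements):
    """Merge all UI quads into one buffer -> a single draw call."""
    vertex_buffer = []
    for e in elements:
        vertex_buffer.extend(quad(*e))
    return vertex_buffer  # uploaded once, drawn with one call

ui = [(0, 0, 800, 40, 0.0, 0.0),    # bottom bar
      (10, 50, 64, 64, 0.5, 0.0),   # button icon
      (700, 50, 90, 30, 0.5, 0.5)]  # tooltip background
print(len(build_ui_batch(ui)), "vertices in 1 draw call")  # 12 vertices in 1 draw call
```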

lyu07282
1 replies
22h1m

If you composite on the CPU, I guess? No, that React/Webpack UI is actually a pretty good solution to complex game UIs. It offers great DX while the performance penalty is minuscule compared to a huge deferred render pipeline. Btw, the last SimCity used the web platform for UI too.

rstat1
0 replies
20h6m

the last Sim City is not something that should be held as a model of what to do here.

Great "DX" now when your building it, but good luck maintaining it over the long term.

johnnyanmac
0 replies
6h46m

Ehh, I've had ideas to try and bridge the web stack to game development. As much as I hate JS, I've hated every other GUI system in games even more, except IMGUI (which isn't necessarily flexible enough for production UI). So why not utilize the decades of knowledge from an industry whose entire paradigm is improving UI workflows?

Of course, it's a butt ton easier said than done, but if the industry could utilize flash for years after it died, I see a glimmer of hope.

>releasing a game with horrible optimisation like this, either by the store selling the game or by the law

you really want the government to regulate video games because you can't run a frame in 16ms? Talk about a slippery slope.

chc
0 replies
22h30m

Using web tech for the UI isn't a problem here. The article, when measuring the performance impact of different rendering phases, describes the time the UI requires as "an irrelevant amount of time."

mvdtnz
7 replies
22h16m

It's stunning and completely unacceptable. This is a product that is not fit for purpose. I hope the developers are embarrassed by what they have produced.

TonyTrapp
5 replies
21h3m

This is rarely the developers' fault. You can bet they wanted to deliver the best product possible, but were not given the time needed to do that by upper management.

hipadev23
2 replies
20h53m

Why is it impossible that maybe Cities Skylines simply has shitty developers?

vore
0 replies
15h37m

Because if you've worked on any large scale software project, technical issues like this are almost always because of priorities and timeline.

johnnyanmac
0 replies
6h50m

Not impossible, but it's a shallow dismissal often used by consumers who don't know how the sausage is made. Large claims require large proof.

IME, the games industry isn't like other pieces of the tech industry where, say, 30% of devs clock in and clock out, collecting a paycheck. It's much rarer (but not impossible) to find someone still in the industry after 5+ years who isn't passionate. The churn is enormous and the kinds of management in there are always pushing to get more out of you.

You don't stick with that because you are low quality or lazy. You do it for the people you work with or the product you work on. Naively, perhaps. But there is certainly fulfillment you get out of it.

mvdtnz
1 replies
20h5m

I didn't blame the developers. I have been involved in projects that I'm embarrassed by even though the worst decisions were the ones made by upper management (hell I worked on the new Jira front end, a continuing source of humiliation).

johnnyanmac
0 replies
6h55m

I just become apathetic. Shame implies that I have control, I'm the face of the game, and that the game's rep directly affects my day to day.

I'm a cog, and even if I break, the machine has plenty of redundancy. Nothing to be ashamed about. I got my experiences, expanded network, compensation, and war stories; they got my X years of labor. It's all a transaction towards each other's true end goals.

vore
0 replies
15h38m

I think the developers are too overworked to feel embarrassment from a random person on the internet criticizing them.

rkagerer
4 replies
22h4m

Can someone expand on exactly what is meant by "model LOD" in this context?

Does the commenter mean they should have implemented a system to reduce texture resolution or polygon count dynamically, e.g. depending on what's in view or how far away it is? That the artists should have made multiple versions of assets, with coarse variants removing things like computer cables from desks in buildings?

solardev
0 replies
20h26m

There were already some great explanations in the replies, but here are a few videos too:

Basic overview: https://www.youtube.com/watch?v=mIkIMgEVnX0

Or a more detailed one: https://www.youtube.com/watch?v=TwaS5YuTTA0

It's been a standard technique in video games for... decades now.

navjack27
0 replies
21h47m

Traditionally, yes, that is done by modelers for video games.

davedx
0 replies
21h48m

Yes, that. It’s short for “level of detail”.

capableweb
0 replies
21h19m

> Can someone expand on exactly what is meant by "model LOD" in this context?

Back in the day, games just had one version of each model, that gets loaded or not.

Nowadays, games with lots of models and huge amounts of detail let each model have multiple different versions, each with its own LOD (Level of Detail).

So if you see a tree from far away, it might be 20 vertices because you're far away from it so you wouldn't see the details anyways. But if you're right next to it, it might have 20,000 vertices instead.

It's an optimization technique to not send too much geometry to the GPU.
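A toy sketch of the selection step; the distance thresholds and vertex counts here are invented for illustration, not anything from C:S2:

```python
# Each entry: (max_distance_m, vertex_count) -- coarser meshes kick in further out.
TREE_LODS = [
    (50.0, 20_000),      # LOD0: full detail up close
    (200.0, 2_000),      # LOD1: simplified
    (1000.0, 200),       # LOD2: very coarse
    (float("inf"), 20),  # LOD3: billboard-like stand-in
]

def pick_lod(distance_m):
    """Return the index of the first LOD whose range covers the camera distance."""
    for i, (max_dist, _verts) in enumerate(TREE_LODS):
        if distance_m <= max_dist:
            return i
    return len(TREE_LODS) - 1

# A close-up tree costs 20k vertices; the same tree across the map costs 20.
print([pick_lod(d) for d in (10.0, 500.0, 5000.0)])  # [0, 2, 3]
```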

tlonny
1 replies
22h9m

> I long for the days of a tuned, polished game engine squeezing every inch of performance out of your PC.

Have you heard of Factorio :)

waveBidder
0 replies
21h50m

How do they keep finding things to improve in their FridayFunFacts? The game is an absolute gem.

pzlarsson
0 replies
20h45m

It is astonishing indeed. In the long run I hope software catches up with the majority of other industries, which have already realized that minimizing waste is a good idea. It might take a while, but unless we get room-temperature superconductivity or some other revolutionary tech first, we will start thinking about efficiency again sooner or later.

asdff
0 replies
19h7m

I don't remember game engines ever being polished and tuned. If you needed optimization, it was always on the community to figure it out. Usually they'd do a good job though, and improve performance 10-fold compared to what the game devs came up with.

Zetobal
0 replies
21h59m

The last one wasn't better either... most games nowadays have game breaking bugs at launch.

wilg
60 replies
22h20m

It sounds like these issues are relatively fixable. It's a classic victim of the Unity engine's tech debt though. I use Unity myself and they desperately need to decide on how they want people to make video games in their engine. They can't have three rendering pipelines and two ways of adding game logic that have a complicated matrix of interactions and missing features. And not great documentation and a bad bug reporting process.

Someone1234
31 replies
22h16m

It makes one wonder what their internal employee incentives are and if they're problematic.

Microsoft has a similar problem where nobody gets promoted from fixing bugs or maintaining stuff, everyone gets rewarded for new innovative [thing] so every two-three years there's a completely new UI framework or similar.

Although I feel like wanting to start anew is a common tech problem, where there are problems and everyone wants to just reboot to "fix" it rather than fixing it head-on, including the backwards compatibility headaches.

cipheredStones
24 replies
21h59m

> Microsoft has a similar problem where nobody gets promoted from fixing bugs or maintaining stuff, everyone gets rewarded for new innovative [thing] so every two-three years there's a completely new UI framework or similar.

Is there any big (or even medium-sized) company where this isn't true? I feel like it's just a rule of corporate culture that flashy overpromising projects get you promoted and regularly doing important but mundane and hard-to-measure things gets you PIP'd.

throw3823423
10 replies
21h43m

It's a matter of letting things degrade so that the maintenance becomes outright firefighting. I am currently working on a project where a processing pipeline has a maximum practical throughput of 1x, and a median day's load for said pipeline is... 0.95x. So any outage becomes unrecoverable. Getting that project approved six months earlier would have been basically impossible. Right now, it's valued at a promotion-level difficulty instead.
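The arithmetic behind "any outage becomes unrecoverable" is stark. A sketch using the 1x capacity / 0.95x load figures above (illustrative only):

```python
def recovery_hours(outage_hours, load=0.95, capacity=1.0):
    """Hours needed to drain the backlog an outage leaves behind.
    During the outage, load * outage_hours of work queues up;
    afterwards it drains at only (capacity - load) per hour."""
    spare = capacity - load
    return load * outage_hours / spare

# A 1-hour outage takes ~19 hours to catch up on with 5% spare capacity.
print(round(recovery_hours(1.0), 1))  # 19.0
```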

At another job, at a financial firm, I got a big bonus after I went live on November 28th with an upgrade that let a system 10x its max throughput and scale linearly, instead of being completely stuck at 1x. Median number of requests per second received on Dec 1st? 1.8x... the system would have failed under load, causing significant losses to the company.

Prevention is underrated, but firefighting heroics are so well regarded that sometimes it might even be worthwhile to be the arsonist

piaste
8 replies
21h24m

Intuitively, "fixing life-or-death disasters is more visible and gets better rewards than preventing them" doesn't seem like it should be a unique problem of software engineering. Any engineering or technical discipline, executed as part of a large company, ought to have the potential for this particular dysfunction.

So I wonder: do the same dynamics appear in any non-software companies? If not, why not? If yes, have they already found a way to solve them?

harimau777
4 replies
21h5m

Outside of software, people designing technology are engineers. Although by no means perfect, engineers generally have more ability to push back against bad technical decisions.

Engineers are also generally encultured into a professional culture that emphasizes disciplined engineering practices and technical excellence. On the other hand, modern software development culture actively discourages these traits. For example, taking the time to do design is labeled as "waterfall", YAGNI sentiment, opposition to algorithms interviews, opposition to "complicated" functional programming techniques, etc.

ghaff
3 replies
20h19m

That's a very idealistic black-and-white view of the world.

A huge number of roles casually use the "engineer" moniker and a lot of people who actually have engineering degrees of some sort, even advanced degrees from top schools, are not licensed and don't necessarily follow rigid processes (e.g. structural analyses) on a day to day basis.

As someone who does have engineering degrees outside of software, I have zero problem with the software engineer term--at least for anyone who does have some education in basic principles and practices.

rat9988
2 replies
19h30m

I have yet to see, with the exception of the software world, engineering with such loose processes.

nvm0n2
0 replies
2h3m

It's common to start constructing buildings before the design is even complete. And there can be huge "tech debt" disasters in civil engineering. Berlin Airport is one famous example.

ghaff
0 replies
19h1m

As someone who was a mechanical engineer in the oil business, I think you have a very positive view of engineering processes in general.

dmoy
1 replies
21h15m

> If yes, have they already found a way to solve them?

A long history of blood, lawsuits, and regulations.

Preventing a building from collapsing is done ahead of time, because buildings have previously collapsed, and cost a lot of lives / money etc.

harimau777
0 replies
21h3m

I remember my very first day of studying engineering, the professor said: "Do you know the difference between an engineer and a doctor? When a doctor messes up, people die. When an engineer messes up LOTS of people die."

nostrademons
0 replies
19h58m

How do you think we got into this climate change mess?

userinanother
0 replies
19h58m

Yeah, but if you had a release target of Dec 15 and it crashed Dec 1st, and you could have brought it home by the 7th, you would have been an even bigger winner. Tragedy prevented is tragedy forgotten. No lessons were learned.

ajmurmann
3 replies
21h49m

Is it only big companies? The fact that many companies in our industry need to do "bug squash" events because we are unable to prioritize bugs properly speaks volumes to me.

Jochim
2 replies
21h5m

Top down decision making, typically by non-technical people who often have no idea what software development even involves.

Eventually things get so bad that there's no choice but to abandon feature work to fix them.

The business loses out multiple times. Feature work slows down as developers are forced to waste time finding workarounds for debt and bugs. The improvements/fixes take more time than they would have due to layers of crap being piled on top, and the event that forces a clean up generally has financial or reputational consequence.

Collaborative decision making is the only way around this. Most engineers understand that improvements must be balanced with feature work.

I find it very strange that the industry operates in the way it does. Where the people with the most knowledge of the requirements and repercussions are so often stripped of any decision making power.

ghaff
1 replies
20h14m

This is pretty much a universal thing--whether it's software development or home maintenance. It's really tempting to kick the can down the road to the point where 1.) You HAVE to do something; 2.) It's not your problem any longer; or 3.) Something happens that the can doesn't matter any more.

I won't say procrastination is a virtue. But sometimes the deferred task really does cease to matter.

3seashells
0 replies
9h43m

At least we wouldn't do that on a planetary scale, right?

hutzlibu
1 replies
21h40m

I think it is a bit tricky to get the incentives right (since the bookkeeping people like to quantify everything). If you reward finding and fixing bugs too much, you might push developers to write more sloppy code in the first place. Because then those who loudly fix their own written mess get promoted, and those who quietly write solid code get overlooked.

xctr94
0 replies
21h32m

Goodhart’s law at work, or “why you shouldn’t force information workers to chase after arbitrary metrics”. Basecamp has been famously just letting people do good work, on their terms, without KPIs.

I will preemptively agree that this isn’t possible everywhere; but if you create a good work environment where people don’t feel like puppets executing the PM’s vision, they might actually care and want to do a solid day’s work (which we’re wired for).

brucethemoose2
1 replies
21h24m

> Is there any big (or even medium-sized) company where this isn't true?

Valve?

uolmir
0 replies
20h56m

From everything I've read, Valve has exactly the same problem. Stack ranking isn't immune. New features still get rewarded the most.

flukus
0 replies
20h30m

It seems endemic, especially everywhere that's not a product company. I think it was The Mythical Man-Month (maybe earlier) that pointed out that 90% of the cost of software is in maintenance, yet 50 years on this cost isn't accounted for in project planning.

Consultancies are by far the worst: a project is done and everyone moves on, yet the clients still expect quick fixes and the occasional added feature, but there's no one left familiar with the code base.

Developers don't help either, a lot move from green field to green field like locusts and never learn the lessons of maintaining something, so they make the same mistakes over and over again.

dymk
0 replies
21h41m

Facebook was pretty good about this on the infra teams. No, not perfect, but a lot better than the other big companies I was exposed to.

If anything, big companies are better about tech-debt squashing, and it's the little tiny companies and startups that are, on average, spending less time on it.

bluedino
0 replies
21h32m

I spent a few weeks migrating and then fixing a bunch of bugs in a 20-year-old Perl codebase (cyber security had their sights set on it). Basically used by a huge amount of people to record data for all kinds of processes at work.

Original developer is long gone. Me and another guy are two of the only people (we aren't a tech company) who can re-learn Perl, upgrade multiple versions of Linux/Apache/MySQL, and make everything else, like Kerberos, work...

Or maybe I'm one of the only people dumb enough to take it on.

Either way, nobody will get so much as an attaboy at the next department meeting. But, they'll know who to go to the next time some other project is resurrected from the depths of hell and needs to be brought up to date.

bandrami
0 replies
14h6m

Aviation. Software will often spend ten times as long in QA and testing as it will in principal development.

Mistletoe
0 replies
19h56m

https://www.thepeoplespace.com/practice/articles/leadership-...

It’s very rare, this is one of the only places I can imagine something like that happening.

capableweb
2 replies
21h48m

The developer of Cities: Skylines has fewer than 50 employees in total; it's a small studio based in Finland (Colossal Order). I doubt they have those sorts of issues at that scale; that's usually something that happens with medium/large companies.

Edit: seems I misunderstood, ignore me

wilg
0 replies
21h46m

Talking about Unity, not Colossal Order.

Epa095
0 replies
21h46m

Unity, not cities skylines.

thrillgore
0 replies
20h43m

We're weeks past a very public pricing change that cost Unity market share to competitors and open source projects, and that led to a CEO change. There are problems beyond what the employees can realistically fix.

johnnyanmac
0 replies
8h46m

> It makes one wonder what their internal employee incentives are and if they're problematic.

Not a very strong one until recently. They implemented FAANG-esque "levels" in late 2021. Promotion lines weren't too atypical of FAANG, but the new system wasn't around long enough to cause the problems people complain about today.

The biggest issue IME was that teams were isolated. Maybe DOTS should have talked more with HDRP/URP about strongly integrating them into its workflow, but DOTS was busy getting off the ground itself. HDRP and URP are two very different teams and they were trying to solve very different problems. Perhaps that was a bad move for the people the engine was serving, who'd want the flexibility to migrate to/from HDRP and URP. There wasn't really much product management trying to make the engine cohesive, so you end up with a rendering core and a bunch of different plugins with different philosophies and whatnot.

>I feel like wanting to start-a-new is a common tech problem, where there are problems and everyone wants to just reboot to "fix" it rather than fixing it head-on inc. backwards compatibility headaches.

To some extent, yes. I think the one huge downside of Unity compared to modern companies is its antiquated CI/CD. Getting changes into the core C++ engine took 10x longer than it really needed to. Iterations on repo builds were slow because Unity simply didn't cough up the money for proper server farms, and the interface for checking the status of PRs felt like it was from 2005. Much of DOTS was iterated on separately with modern Jira/GitHub/etc. pipelines, and the DOTS repo was very lean (and it was public too... until it wasn't. I think they moved it in 2022?). Making a change to the core engine was like traveling back in time 15 years.

Legacy code is a pain as is. And Unity definitely needed to revamp some non-feature workflows before it could really dig into the core Unity engine issues.

hnthrowaway0315
0 replies
20h14m

It's a combination of one team not being given enough time and headcount to maintain and develop a product, and another team's manager wanting to grab a fief.

So old products are thrown away while new products with similar functionalities are being created.

Both teams are happy. The users suffer.

tus666
11 replies
22h1m

It's a classic victim of shitty, shitty software developers who blames tools rather than taking ownership.

Or shitty software dev companies that push out crap to meet marketing deadlines.

Either way, take your money elsewhere.

gamblor956
10 replies
21h1m

Given that a number of other Unity-based games have had the same or similar performance issues, including KSP1, the Endless games, and others, it seems the problem is very much that Cities: Skylines 2 is hitting the performance limits of the Unity engine absent custom modifications to the engine-layer codebase.

MrLeap
5 replies
19h50m

I have personally been responsible for optimizing Unity games that you haven't heard issues like this about ;)

This write-up really points the finger at not solving occlusion culling or having good LOD discipline.

Give a person a dedicated optimization mandate and you can avoid most of this. One of the first things I do when I'm profiling is to sort assets by tris count and scan for excess. I wonder if they had somebody go through and strategically disable shadowcasting on things like those teeth? I am guessing that they made optimization "everybody's responsibility" but nobody had it as their only responsibility.

gamblor956
2 replies
19h37m

Occlusion culling and LOD should be handled by the engine, not the game logic, so the write-up really points to the problem being Unity's new and very incomplete rendering pipeline for ECS.

vasdae
1 replies
19h30m

Granted I know next to nothing about game development, but aren't LOD models made by hand?

MrLeap
0 replies
19h10m

There are tons of answers to this! I'm going to say that in projects I've worked on, LODs have been hand made about 60% of the time.

There are tools for creating automatic LODs that come with their own pros and cons. A bad LOD chain can express itself as really obvious pop-in while you're playing the game. There are also these things called impostors, which are basically flipbook images of an object from multiple angles that can be used in place of the true 3D geometry at a distance. Those are created automatically. They tend to be like 4 triangles but can eat more VRAM because of the flipbook sizes.

Unreal Engine has Nanite, which is a really fancy way to sidestep needing LOD chains with something akin to tessellation, as I understand it. Tech like that is likely the future, but it is not accurate to describe it as the way most games are made today.
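To make the LOD-chain idea above concrete, here is a minimal sketch (not from any engine or from this game) of how a per-object LOD level is typically chosen by camera distance. The distance thresholds, triangle counts, and the `log_pile_lods` chain are made-up illustrative numbers; the final entry stands in for a 4-triangle impostor card.

```python
# Illustrative LOD selection by camera distance. All numbers are
# hypothetical; real engines also factor in screen-space size, hysteresis
# to avoid flickering between levels, and per-platform quality settings.

def select_lod(distance, thresholds):
    """Return the index of the first LOD whose max distance covers `distance`.

    `thresholds` is a list of (max_distance, triangle_count) pairs,
    ordered from most detailed (LOD0) to least detailed.
    """
    for level, (max_dist, _tris) in enumerate(thresholds):
        if distance <= max_dist:
            return level
    return len(thresholds) - 1  # past every threshold: use the coarsest LOD

# Hypothetical chain for a prop like the log pile: LOD0 is full detail,
# the last entry is a flat impostor used at any distance.
log_pile_lods = [
    (25.0, 100_000),
    (75.0, 10_000),
    (200.0, 1_000),
    (float("inf"), 4),
]

print(select_lod(10.0, log_pile_lods))   # close to the camera: prints 0
print(select_lod(500.0, log_pile_lods))  # far away: prints 3 (impostor)
```

The point of the chain is that the renderer only ever pays for the triangle count of the selected level, so a 100K-triangle LOD0 is harmless as long as it stops being selected past a short distance.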

hellotomyrars
1 replies
19h26m

Yeah I mean regardless of any of Unity’s limitations, this is entirely upon the developer.

However, I also find odd the suggestion that because there are other high-profile examples of Unity projects with performance issues, it must be a problem with Unity.

You don’t hear that about Unreal Engine, despite the fact that there are poorly optimized UE games.

Such a bizarre set of assumptions.

johnnyanmac
0 replies
7h43m

There is definitely a lot of public bias between the two engines. UE4 couldn't solve this problem either (UE5 might, but Nanite isn't quite as "it just works" as you'd expect as of 5.1).

It definitely has much to do with how UE's PR constantly shows off and ships new and exciting features that blend into the engine. Meanwhile, Unity has been criticized for some 6+ years minimum for its package management and lack of cohesion.

raincole
2 replies
20h51m

I'll be really surprised if City Skylines's team didn't have access to Unity's source code.

kimixa
0 replies
19h55m

And do they have the number of engineers with the required skills to rewrite half the engine? Especially if the reason why they developed using those tools and engine is they expected not to have to do it themselves in the first place?

It's not like there's just some "go_slow=true" constant that just needs changing.

bananaboy
0 replies
17h56m

It's pretty unlikely. A source code license is negotiated with the sales team directly and costs at least USD100k last I heard (the price is not publicly disclosed). They're also reluctant to give source code licenses at all.

LarsDu88
0 replies
12h46m

This is simply not correct. Cities: Skylines 2 even went through the trouble of using DOTS, which is something you cannot take advantage of in Unreal Engine or Godot. To get more optimal than that on the CPU utilization side, you would be writing your own engine in C++ or Rust.

The fuck-up here is whoever was handling the art assets. You simply do not ship a game with such detailed graphics and no LODs. They must've simply been downloading things off the Asset Store and throwing them in without any regard for performance.

KronisLV
4 replies
20h21m

Honestly, automatic LOD generation would solve at least some of the performance issues: add the functionality, make it opt-out for those that don't need LODs and enjoy performance improvements in most projects, in addition to some folks getting a simpler workflow (e.g. using auto-generated models instead of having to create your own, which could at the very least have passable quality).

Godot has this: https://docs.godotengine.org/en/stable/tutorials/3d/mesh_lod...

Unreal has this (for static meshes): https://docs.unrealengine.com/5.3/en-US/static-mesh-automati...

Aside from that, agreed: the multiple render pipelines, the multiple UI solutions, the multiple types of programming (ECS vs GameObject) all feel very confusing, especially since the differences between them are pretty major.

mardifoufs
3 replies
19h19m

I'm pretty sure unity already has that.

KronisLV
1 replies
18h45m

Out of the box, it only has manual LOD support for meshes: https://docs.unity3d.com/Manual/importing-lod-meshes.html (where you create the models yourself)

They played around with the idea of automatic LOD, but the repo they had hasn't gotten updated in a while: https://github.com/Unity-Technologies/AutoLOD

The closest to that would be looking at assets on the Asset Store, for example: https://assetstore.unity.com/packages/tools/utilities/poly-f...

An exception to that is something like the terrain, which generates the model on the fly and decreases detail for further away chunks as necessary, but that's pretty much the same with the other engines (except for Godot, which doesn't have a terrain solution built in, but the terrain plugins do have that functionality). I guess in Unity's case you can still get that functionality with bought assets, which won't be an issue for most studios (provided that the assets get updated and aren't a liability in that way), but might be for someone who just wants that functionality for free.

mardifoufs
0 replies
17h27m

Ok that's pretty surprising. I didn't know it was still this bad

wilg
0 replies
18h41m

It doesn’t which is really annoying.

frozenfoxx
3 replies
20h48m

I worked at Unity on Build Automation/Cloud Build for nearly a decade. Let me assure you, that tech debt is NOT being fixed any year soon. It’s due to a fundamental disconnect between executive leadership wanting to run the company like Adobe (explicitly) and every engineer wanting to work like a large scale open source project (Kubernetes, Linux, and Apache are pretty close in style). The only way anything gets built is as a Skunkworks project and you can only do so much without funding and executive support.

robterrell
0 replies
18h4m

Can you elaborate on this?

johnnyanmac
0 replies
8h41m

> I worked at Unity on Build Automation/Cloud Build

My condolences. The build iterations and ancient CI workflow were by far the biggest complaint of the teams I was on and talked to. It takes so long getting stuff properly landed into trunk that I can't really blame the teams for wanting to break off if possible (it fortunately was for my division).

But I'm guessing it was hard to convince product or execs that dev velocity matters, so we were all just wading in muck. I heard things were improving... but I heard that every month I was there.

>executive leadership wanting to run the company like Adobe

I think I know what you mean by this, but can you clarify?

LeanderK
0 replies
19h40m

> run the company like Adobe (explicitly)

what does this mean?

AuryGlenz
3 replies
20h58m

It's honestly a bit insane.

Just the other night I wanted to know what it'd take to do some AR development for the Quest 3 using Unity. 10 minutes in I was straight up confused. There's AR Foundation, AR Core, AR Kit, and I think at least one other thing. I have no idea the difference between those, if they're even wholly separate. That's on top of using either the OpenXR or Unity plugin for the actual headset.

andybak
2 replies
20h51m

AR Kit is Apple's thing. AR Core is Google's thing. Neither of those is Unity's fault. AR Foundation is a Unity layer that presents a common interface, which in my books is a good thing.

OpenXR is also an attempt to make a cross-platform layer for vendor-specific APIs. Again, not Unity's fault. The Unity plugin system is a common interface for all XR devices.

I'd generally support your sentiment but in this case you're picking on things where Unity had mostly got it right.

AuryGlenz
1 replies
13h45m

This comment made things more clear to me than the documentation that I saw, perhaps because I was looking at it from a Quest 3 perspective and their main page about it doesn't mention that at all.

I know what Open XR is all about but again, it's not clear which you should actually use for development if you're only targeting Quest devices, for instance. A little extra documentation would go a long way.

The same goes for all of the other things so frequently mentioned like their renderers.

squeaky-clean
0 replies
11h23m

Honestly, like a lot of Unity functionality, what you do there is pay $50-100 for an asset from the store that handles things in a sane way.

moffkalast
0 replies
21h17m

I think it's a good thing in the long run, one more reason to switch away from Unity to add to the ever growing pile.

dexwiz
0 replies
21h47m

Sounds like every other enterprise software platform. Unity has reached the IBM level of "no one gets fired for choosing X," even though X only makes the business people happy.

araes
0 replies
19h36m

It sounds like they need to implement easy-to-use level of detail (LOD) and progressive meshes. 100,000 vertices on faraway objects will break most rendering pipelines that don't somehow reduce them: 100,000 complicated matrix transforms instead of the roughly 8 it probably takes when really far away.

[1] https://en.wikipedia.org/wiki/Level_of_detail_(computer_grap...

[2] https://en.wikipedia.org/wiki/Progressive_meshes
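As a toy illustration of how automatic simplification can work, below is a minimal vertex-clustering simplifier in Python: snap vertices to a coarse grid, merge the ones that land in the same cell, and drop the triangles that collapse. This is one classic technique behind automatic LOD tools (the progressive meshes linked above use finer-grained edge collapses instead); the mesh and cell size here are made-up examples, not anything from the game.

```python
# Vertex-clustering mesh simplification (a sketch, not production code).
# Vertices in the same grid cell are merged into one; triangles whose
# corners all merge together become degenerate and are removed.

def simplify(vertices, triangles, cell_size):
    """Return a coarser (vertices, triangles) pair for the given mesh."""
    cell_of = {}        # grid cell -> index into new_vertices
    remap = []          # old vertex index -> new vertex index
    new_vertices = []
    for (x, y, z) in vertices:
        cell = (round(x / cell_size), round(y / cell_size), round(z / cell_size))
        if cell not in cell_of:
            cell_of[cell] = len(new_vertices)
            new_vertices.append((x, y, z))  # keep the first vertex in the cell
        remap.append(cell_of[cell])
    new_triangles = []
    for (a, b, c) in triangles:
        a2, b2, c2 = remap[a], remap[b], remap[c]
        if a2 != b2 and b2 != c2 and a2 != c2:  # drop collapsed triangles
            new_triangles.append((a2, b2, c2))
    return new_vertices, new_triangles

# Toy mesh: two quads (four triangles) with nearly-duplicate vertices.
verts = [(0, 0, 0), (0.1, 0, 0), (1, 0, 0), (0, 1, 0), (0.1, 1, 0), (1, 1, 0)]
tris = [(0, 1, 4), (0, 4, 3), (1, 2, 5), (1, 5, 4)]
new_verts, new_tris = simplify(verts, tris, cell_size=0.5)
print(len(verts), "->", len(new_verts), "vertices;",
      len(tris), "->", len(new_tris), "triangles")
# prints: 6 -> 4 vertices; 4 -> 2 triangles
```

Grid clustering is crude (it can destroy silhouettes), which is why quality-sensitive pipelines prefer edge-collapse methods, but it shows why generating coarser LODs can be cheap and automatic.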

vGPU
48 replies
22h9m

> This mesh of a pile of logs is similarly only used in the shadow rendering pass, and features over 100K vertices.

But… why?

clnq
31 replies
21h2m

Because it is one of the 1,000,000 things to pay attention to in game development. Someone or some software probably just made a mistake in setting up its LOD. Or some dynamic LODding code didn't properly cull the LOD0 mesh. Or that code couldn't be finished in time. Or it was something else.

It's completely normal in AAA games to have a few imperfect and suboptimal things. Budgets are always limiting, and development times short. Plus, it's a hit-driven industry where payoff is not guaranteed. There are some things you can do (which are usually management-related and not dev-related) to make a game a success, but estimated bookings are rarely on point. So trade-offs have to be made to de-risk: corners cut where possible, and the most expensive part, development, de-prioritized. These are much bigger trade-offs than a single mesh being unoptimized. A single mesh is nothing.

It's a fun fact that this mesh is LOD0, and so is the teeth mesh. But that alone doesn't tank the performance of the game and is probably unlikely to be addressed in lieu of actual performance fixes. The fixation on these meshes in the thread is kind of excessive.

A lot of these comments are quite galvanized so I don't want to add to that - just giving more context.

mvdtnz
15 replies
19h9m

> It's completely normal in AAA games to have a few imperfect and in-optimal things.

No, mate, stop. The state of C:S2 is well beyond anything we should accept as "completely normal". It's a defective product that should not have been released. Stop normalising this crap.

dingnuts
6 replies
16h11m

Good grief, I have a mid tier AMD card and I'm having a blast with almost 40 hours in the game already. Can we quit with the "defective" propaganda?

The game runs fine and it's really fun. This "controversy" really drives home for me how detached from reality online discourse often is

hwillis
3 replies
15h20m

The most common GPU on steam stats is a 3060. The AMD 7800/7700 do 90% and 70% better on benchmarks. So if you have either of those, you're getting nearly twice the FPS that the most common steam user would see.

beowulfey
1 replies
4h32m

I have a 1660Ti. I get 40+ fps on 1080p. It's functional. Now, whether it is really any better than the previous generation is the more relevant question in my opinion. There are a few things improved, and a whole lot missing. The performance issues are only a small fragment of the game's issues.

red-iron-pine
0 replies
2h34m

yeah this here is the real point.

runs alright for me, too. just alright, but for what it is that's fine.

however it's not really much better than the predecessor, and needs to give me a reason to give up on the HUGE mod community and well documented approaches from CS1.

squeaky-clean
0 replies
11h36m

I'm running on a laptop 2070 and doing just fine. Calling it defective is just blatantly lying.

It should be better optimized, but calling it "defective" does nothing but make people dismiss your comment.

unaindz
0 replies
11h45m

It works on my machine

programcookie
0 replies
7h19m

While I tend to agree that there are a few people bashing it just because everyone else seems to be doing it, I would also like to ask you to consider that not all the complaints are invalid.

Case in point: I get 6-10 fps on a 7900XT with the default settings, 17 if I use the lowest preset, all while seeing around 55-ish percent system utilization.

Something is amiss here and the game is definitely not running fine for everyone.

As someone seeing this weird performance issue I wish both sides would focus less on screaming at each other and make it easier to figure out the root cause so everyone can enjoy the game.

Retric
3 replies
18h56m

Their point is that the specific mesh could be left alone and the game would still be playable, as long as other issues were fixed.

Chances are a nearly complete version of C:S2 was playable and they “broke it” at the last minute by not finishing the optimization process.

mvdtnz
2 replies
18h46m

That's speculation based on nothing but vibes.

Retric
1 replies
18h36m

It's speculation based on these mesh sizes being so arbitrary in the game development process, and on what's broken being unnecessary window dressing for gameplay. It's the kind of thing that could be delayed to the last minute with some simple placeholder.

“Now you might say that these are just cherry-picked examples, and that modern hardware handles models like these just fine. And you would be broadly correct in that, but the problem is that all of these relatively small costs start to add up, especially in a city builder where one unoptimized model might get rendered a few hundred times in a single frame. Rasterizing tens of thousands of polygons per instance per frame and literally not affecting a single pixel is just wasteful, whether or not the hardware can handle it. The issues are luckily quite easy to fix, both by creating more LOD variants and by improving the culling system. It will take some time though, and it remains to be seen if CO and Paradox want to invest that time, especially if it involves going through most of the game’s assets and fixing them one by one.”

I.e., the game would have looked nearly complete even if none of these meshes were in use. Meanwhile, the buildings themselves are optimized.
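The quoted passage's "small costs add up" point is easy to see with back-of-the-envelope arithmetic: the per-frame triangle cost of one prop is roughly instances × triangles per instance × render passes. All numbers below are illustrative assumptions, not measurements from the game.

```python
# Rough per-frame triangle cost of a single prop type. "passes" stands in
# for the number of times the mesh is rasterized per frame (e.g. the
# G-buffer pass plus one shadow pass); real renderers vary.

def tris_per_frame(instances, tris_per_instance, passes):
    return instances * tris_per_instance * passes

# Hypothetical: a 100K-triangle prop rendered 300 times vs. the same prop
# with a 1K-triangle LOD used for most of those instances.
unoptimized = tris_per_frame(instances=300, tris_per_instance=100_000, passes=2)
with_lods = tris_per_frame(instances=300, tris_per_instance=1_000, passes=2)
print(f"{unoptimized:,} vs {with_lods:,}")  # prints: 60,000,000 vs 600,000
```

A two-orders-of-magnitude gap from a single prop type is why "the hardware can handle one bad mesh" doesn't save a city builder that draws hundreds of them.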

ethbr1
0 replies
15h14m

Agreed.

This really smacks of late asset delivery, which probably happened because delivery dates to the rest of the dev team kept being bumped.

Then, by the time the assets were finally delivered, it was recognized they weren't optimized (as expected), but there wasn't any time to fix that.

Or the same thing with an internal asset team.

Although honestly, you'd think after seeing the performance numbers they would have implemented an "above X zoom level, people / anything smaller than this are invisible" sledgehammer fix, until LOD could be properly addressed.

Better to deal with pop-in than have everything be unexpectedly slow.

taneq
1 replies
18h15m

What trade off would you choose between fixing a performance issue which is somewhat avoidable and fixing a game breaking crash bug? Because the latter gets priority and there’s never enough time to fix all of those before launch, let alone work your way up to closing out every last frame rate drop.

Aeolun
0 replies
17h11m

I think if you get 100k poly models in your (city builder) game in the first place (for any but the most amazing wonders) your process has failed spectacularly at some point.

johnnyanmac
0 replies
12h43m

you're missing the forest for the badly rendered tree in some E3 showcase that everyone nitpicked to death. Can we not treat this discussion like a Reddit rant, please?

Gamedev has many shortcuts, and some things simply fall through the cracks. Some are caught and fixed, some are caught and not fixed, and some just aren't caught at all. I imagine it's the second case here; there's an unfortunately large number of bugs these days that are caught by QA but not given time to be fixed before publisher deadlines.

Guthur
0 replies
16h16m

If they pay for it, they accept it; it's that simple.

All this crying when they could have simply returned the product or not buy it at all. Colossal Order themselves warned about the performance before it was even released. There were plenty of reviews that said the same thing.

So to get up in arms about the performance means they are just being exceptionally stupid and entitled, and they should just grow up and stop crying over their toys.

Colossal Order can release whatever garbage they want to. And you can choose to buy it or not, or even buy it and return it (fight for better return policies if you want something positive).

eloisant
4 replies
19h23m

I get that you can leave a bunch of things unoptimized, as long as it works fine.

What I don't understand is: how did they not notice that the performance was horrible even on high-end hardware? How did they not decide to take the time to investigate the performance issues and find the causes we're talking about now?

clnq
2 replies
17h21m

So you know the saying “premature optimization is the root of all evil?” Producers love that statement because it removes half of the complaints around work being rushed.

Optimization is not done throughout the process and later there’s not enough time. Assets are made with bad topology and it would take time to redo them. Or it would take time to write a tool that retopologizes them automatically.

What I'm saying is that by the time it's "time" to optimize, there's not enough time to optimize. It happens very commonly. But the alternative is taking development slower to do things right, and you simply don't get investment for schedules like that in most companies. Not to mention that it's goddamn hard to do when the execs lay off people, ask them to RTO, and induce serious attrition otherwise. Sometimes the team just can't settle into a good process because people leave it too often. So you're between a rock and a hard place: on the one hand, attrition and low morale; on the other, a tight schedule. This doesn't apply to Colossal Order from my knowledge, but it does apply to many AAAs.

There is a problem at the root of this - extremely over-ambitious production schedules as norm. Most other things are symptoms. Most of what I described is a symptom.

athrow
1 replies
17h7m

Except there really isn't a better product to justify these issues. There were some important improvements made to the gameplay and traffic system, certain things were reworked, nicer graphics, etc. It feels like an iteration and not a ground-breaking game that would justify the performance issues we're seeing.

clnq
0 replies
14h26m

The design was iterated, but the game assets are redone almost completely, and the systems appear largely reworked, too. This is evident when you play the game.

The scope of work done for this game was exceptionally large for a company with 40 employees, assuming it was done within the usual AAA timeframe.

tbillington
0 replies
19h6m

I _guarantee_ they knew about it.

They even posted on social media 1 week before launch warning people to expect lower than expected performance, and raised the system requirements.

If companies have to decide between prioritising features that they've advertised, show stopper bugs, and performance, guess which one always takes the back seat :)

Dylan16807
4 replies
15h5m

> Because it is one of the 1,000,000 things to pay attention to in game development.

Finding objects with ridiculous triangle counts is one of the easiest things to do when you have known performance issues.

If they didn't have time to do the first, easiest chunk of the work, then something was far more dysfunctional than "it's one of a million things to deal with".

jiggawatts
3 replies
12h59m

This is on par with: “Sure, there’s a raging fire in that corner of the office, but it’s just one of a million things I had to deal with that day. That’s why I didn’t call the fire department, okay!?”

Staying on top of poly count budgets is like game dev 101.

johnnyanmac
0 replies
12h40m

Happens more than you think, despite the absurd metaphor. Maybe there was an inferno in the neighborhood and no time to worry about the fire in the corner of the apartment. Maybe your publisher doesn't care if your house burns down but wants to make money now for their own earnings.

I don't think it's uncommon even outside of gaming for modern software to have these sort of "hidden fires". Games just get a lot more conversation and a lot of niche problems as a 3D real time application.

clnq
0 replies
3h58m

No, friend, poly counts are rarely a raging fire these days. Not 10k poly counts. That was in the 90s.

3seashells
0 replies
10h0m

But gamedev usually means all-new-faces teams, as the old ones quit in lockstep. So it's very likely a bunch of youngsters running around yelling that "premature optimization" is the death of all good things, mistaking that for needing no optimization plan at all until the game is almost done.

dimgl
2 replies
18h12m

> Because it is one of the 1,000,000 things to pay attention to in game development.

This is a cop-out. This doesn't seem like an oversight but rather blatant incompetence. You don't just "not pay attention" to this.

johnnyanmac
1 replies
12h38m

you call it incompetence, devs call it "publishers told us to ship now". You'd think a technical community like this would sympathize with such mandates given across the industry.

mook
0 replies
8h48m

Call it incompetence on the part of the management, then?

smolder
0 replies
19h20m

You're right that this kind of stuff is sort of par for the course. As in other cases, it's indicative of (IMO) a bad development process that they didn't budget the time to polish before shipping. I save my games budget for stuff that is "done when it's done", not rushed out, mostly out of principle.

If you aggressively min-max development cost & time vs features, there are big external costs in terms of waste (poorly performing software carries an energy and hardware cost,) end-user frustration, stress on workers, etc., which is how I justify voting with my money against such things.

account42
0 replies
1h7m

> It's a fun fact that this mesh is LOD0, and so is the teeth mesh. But that alone doesn't tank the performance of the game and is probably unlikely to be addressed in lieu of actual performance fixes. The fixation on these meshes in the thread is kind of excessive.

So I assume you have a better explanation of the excessively slow G-buffer and shadowmap passes?

ripper1138
9 replies
20h39m

The studio that made this has like 30 devs.

harrid
6 replies
20h32m

This doesn't fly with a one-man team, and not with a team of 1,000 either. It's just badly done; there's no sugarcoating it. Those meshes should never have ended up in the game files.

johnnyanmac
4 replies
12h34m

>This doesn't fly with a one man team and not with a 1000

Sounds like someone who has never worked on a 1,000-dev team. Random quirks either go unnoticed or are de-prioritized all the time. Most are minor, but more and more moderate-to-major ones are getting through. That's definitely a publisher issue.

kamray23
3 replies
8h7m

random quirks do. this is not random quirks. it's a systematic and expected issue of underoptimisation caused by releasing a product before it was even slated to be ready. one bad mesh is not the issue, it's never the issue. we're talking of thousands of terrible meshes and a near-total lack of basic optimisations applied at the last stage of development to most games. the manpower was not enough to release within the deadline, likely due to running into a lot of technical difficulties working with unity. instead of going into valve time, they released it anyway, which means that you skipped the entire polish and optimisation part not only for the game itself but for half the engine as well. poor performance was not only expected, i'm certain that every member of the team saw it as the only possible outcome.

johnnyanmac
2 replies
7h50m

I'd call 4 examples of "this needed LODs" random quirks in the grand scheme of things. It's not like every single mesh is 100k vertices. Grossly underoptimized, yes. But the devs prefacing their announcement with "we're not satisfied" tells me they were too busy slaying dragons to worry about the annoying barking Chihuahua in the room.

It was expected, yes. That does not mean they weren't trying to fix it at the 11th hour. I wouldn't be surprised if some core tech was unfinished or inadequate, which led to this.

>instead of going into valve time, they released it anyway, which means that you skipped the entire polish and optimisation part not only for the game itself but for half the engine as well.

Yup, welcome to game development when you have deadlines and no benevolent (or at least apathetic) dictator paying your bills. It's unfortunate that we can trace this back to the '80s with E.T., but these are simply the business realities. Game code isn't mission-critical (and until recently, didn't care about maintainability), and it also isn't what sells the product.

So it never gets the time to be cultivated like in other industries. And people still buy anyway. It's a two-way street of apathy, and every publisher hopes it can slip through the cracks and not become the next Superman 64. Most manage to slip.

There's not much you can do about it with the current publishing structure, where most funders don't work in nor care about games. And the ones that do still see their money draining whenever the talk of delays come up. That won't be solved except with time as more industry trailblazers retire and shift to management (remember, Todd Howard is only in his 50's. Gabe and Kojima are 60. So many pioneers are still well under retirement age). Or for more truly indie teams to arise and learn how to scale up projects while staying lean. The latter is what I hope to do.

kamray23
1 replies
7h9m

It's pretty clear from the performance the game has that it's a lot more than four models with bad LODs. They would probably have identified that issue within days. There's likely a more fundamental issue, not only with the technology but also with the LODs for most detail models. These are only examples, since you can't list every one of the hundreds of models being rendered.

I really think that the performance and scale indie developers can squeeze out puts AAA developers to shame nowadays, and I really hope that that'll continue to happen. All it takes is time, organisation, and a lot of time.

johnnyanmac
0 replies
7h0m

Sure, more than 4, less than however many assets there are in the game. I mostly want to emphasize that a systematic issue implies that this was an acceptable development pipeline, which I doubt any engineer on the team would say.

>I really think that the performance and scale indie developers can squeeze out puts AAA developers to shame nowadays, and I really hope that that'll continue to happen. All it takes is time, organisation, and a lot of time.

Yup, I agree. Indies don't tend to have money, so the budget comes from elsewhere. We can already utilize some amazing tools to cut down the time of scaling up environments; not as much with actor assets. But I don't think it's too far off (it's more just locked away in academic white papers).

dimgl
0 replies
18h11m

Thank you! This is what I was getting at in another comment. This isn't just a case of "oh no, I forgot to switch a button".

LarsDu88
1 replies
12h49m

I'm solodeveloping a game, and there's no fucking way a 100,000 vert mesh is getting in the game without a LOD. My game is running on the Quest 2 at 72 fps stable

johnnyanmac
0 replies
12h33m

Sure, it's easy to catch major hiccups when one person has 100% of the knowledge base. Not so much when 30 devs each have different responsibilities and the optimizer guy is drowning in other fires.

jameshart
3 replies
18h19m

Well, because when you’re modeling a pile of logs, each additional log you add doubles the number of vertices.

This is a well known property of log scaling.

function_seven
1 replies
17h16m

I’m too dumb to know if you’re making a math joke or if this is a real 3D modeling thing.

Or both?

Aeolun
0 replies
17h7m

It’s a joke

amluto
0 replies
1h3m

If you make natural looking logs, it’s a factor of 2.7 or so.

matsemann
0 replies
20h58m

It could have been like a hundred vertices and a clever normal map. Just insane.

Tijdreiziger
0 replies
21h20m

Hey, ya gotta have logs /s

kossTKR
39 replies
22h41m

This is everything wrong with both (sweatshop) games and programming!

This is a 180 from how people programmed for the older consoles in the most fun and creative ways to squeeze the most out of smaller hardware.

Take a look at this for comparison:

https://arstechnica.com/gaming/2021/09/war-stories-how-crash...

Reminds me of the state of frontend development where you need 8000 tools and dependencies to show even the simplest of things, and i'm not surprised they bundle React for the menus.

I absolutely hate this way of doing things, because for me all art, all engineering, all creativity is about boundaries, dogmas, and squeezing and optimising the hell out of your _elegant_ systems.

I mean even in my small webgl/threejs projects the fun part was getting every last bit of eye candy out of the smallest file sizes i could, simplifying geometry, lowering resolution, while maintaining great looks.

100k-vertex log piles and hundreds of people with teeth?

Optimisations like this aren't even hard or time consuming (and they are fun) - can anyone clue me in on why you ship your stuff in this state - what happens in a studio like this? Is it 100% shitty work conditions? How could single devs and small studios create relatively large games with love 20 years ago for small money?

ajmurmann
14 replies
22h32m

> Optimisations like this aren't even hard or time consuming (and they are fun) - can anyone clue me in on why you ship your stuff in this state - what happens in a studio like this?

Business and survival is what happened. Read any of Jason Schreier's books on game development. Income is very chunky with games being in development for years. Postponing by a few months might sink your company, especially with interest rates being high.

Calling out that something is "easy" and "fun" is likely insulting to the developers who are frequently working 70-80 hour weeks and ruining their family life in the process.

Pannoniae
11 replies
22h28m

They are wasting all that effort in the wrong things though. I fully understand the problem of crunch - but these poor devs wouldn't have to crunch as much if the game's budget wasn't wasted on "analytics" and useless features, instead of polishing the core gameplay, then adding the fluff after launch.

In this game, they've focused on the superficial stuff, yet the core gameplay is still broken (instead of a city builder where citizens have agency, the game is effectively a god simulator where your biggest challenge is traffic management; economy and politics are a joke)

This is why many people prefer older games - the amount of effort spent on the gameplay itself is only decreasing year by year, while most of the budget is spent on useless graphical effects, quirky things everyone forgets in two months and all the usual "analytics"/"cloud" stuff.

habinero
8 replies
22h10m

Nobody sets out to make a bad game or "waste effort" on "useless" features.

Things like this happen because the people giving you money have a hard deadline and you ship what you have.

Or you decide to use a game engine that was more difficult to use than expected.

It's insulting to say "well, just make the game better first, duh". I promise you, they know.

But they have to balance a lot of things you don't see.

If you ever find yourself saying "why don't they do [obvious thing]", stop and assume you don't have all the facts.

Pannoniae
7 replies
22h5m

I would believe this... if games made 15 or 20 years ago, and indie games made today, weren't able to manage it.

It's always the bigger studios who utterly mess up in making an actually playable game, which indicates that the problem is not something inherent but a simple product of laziness and greed. (Latest example: see Creative Assembly's meltdown)

ajmurmann
4 replies
21h56m

As scope increases, all the organizational challenges balloon. Coordinating 10 people is much easier than several hundred. It's the same almost regardless of domain. What happens if your core game loop still is no fun, but you've got 100 people rolling off a previous project, ready for the next phase of this new project, where you need them to start designing levels and assets? It's much easier to fix if the additional 3 months of development is just 3 months of cost of living for John Romero and John Carmack.

Pannoniae
3 replies
21h42m

It's not like game studios have increased in size; it's just more bloat. Look at old games' credits: you'll find maybe even bigger studios (making art and programming with limited hardware was much more time-consuming...), but they had good gameplay on a much smaller budget. Today, studios waste money on analytics, governance things and other fluff...

The best case study is Mojang: the company has over 800 employees but is literally outperformed in game design and update quality/quantity by the ten people at Re-Logic (which includes managers and legal as well!).

gamblor956
1 replies
20h3m

Terraria started off as a 2D homage to Minecraft... The very first release basically was just a 2D version of Minecraft. It took a few updates for Terraria to become its own thing.

I enjoy both games, especially Terraria, which I have played since its original release. But let's not lie to ourselves that the volume of content updates for Terraria is anywhere close to the updates that Minecraft has received. Adding content for a 2D game is a lot easier than adding content for a 3D game, even if you're using voxels.

ajmurmann
0 replies
19h46m

IMO, the parent picked a terrible example. Comparing any game to "Minecraft" doesn't make much sense to me. What even is Minecraft at this point? There seem to be multiple versions on a multitude of platforms, some on the same platform targeting different demographics. Different changes are needed to keep the different demographics hooked. Of course, the original was built by a single guy, which avoided all the organizational complexities.

ajmurmann
0 replies
21h31m

> it's not like game studios have increased in size, it's just more bloat

Are you serious? Teams have increased massively. Super Mario Kart for example had less than 20 people working on it. That's not even the size of the audio department for many modern AAA games

waveBidder
0 replies
21h42m

The indies that don't make a viable product, you don't see.

habinero
0 replies
17m

That's just bias on your part.

You don't see the thousands of indie games that were never released because the creators screwed it up, or they ran out of money.

Or the indie studios who never made a second game because they didn't recoup their costs.

Indies that fail just fold and you never hear about them. AAA that fail release their games.

ajmurmann
1 replies
22h18m

None of that is the engineers' fault. All that comes from the game directors and business people.

stylepoints
0 replies
16h8m

And for very good reasons. A huge fraction of players buy the game based on how it looks.

vGPU
1 replies
22h3m

Sure, but this is Paradox, a company that constantly makes buckets of money on DLC for Stellaris, Europa Universalis, Crusader Kings, etc. I doubt they were about to run out of money. They just had to release a new race of scantily dressed aliens for Stellaris and they'd be good for another half a year.

meepmorp
0 replies
21h0m

Paradox is just the publisher, it's developed by Colossal Order.

QuickMinuteOats
10 replies
22h33m

Pardon the snark, but the answer should be obvious: budget and deadlines.

Producing well-optimized, "clever" code usually requires orders of magnitude more time than the simplest, quickest solution. Hobbyists like yourself, and a few companies like Nintendo, are typically the only ones that can afford to spend time optimizing like you describe.

imiric
4 replies
22h5m

I don't buy that excuse. To me it seems more like game developers have become complacent with the powerful hardware consumers have at their disposal, especially on PCs, and the fact they can always push fixes after the release. Decades ago it used to be a major milestone when a game "went gold". It meant QA was successful and that the game was fully playable. Budget and deadlines also existed back then, but there was (usually) much more care taken to ensure a good gaming experience, regardless of the hardware. Saying that only a few companies can do this successfully today is excusing objectively bad development practices.

Consumers should vote with their wallet, and stop preordering and falling for preorder bonuses and marketing hype, which is another disease affecting modern gaming. Unfortunately, publishers know that they can release a lackluster product based on hype alone (No Man's Sky, Cyberpunk 2077), and then spend years "polishing" a game into a state they promised before the initial release. The modern gaming industry is rife with scams like these to the point that it should be heavily regulated. So, no, none of these companies can be excused for releasing a garbage product and charging full price for it.

clnq
3 replies
21h12m

> I don't buy that excuse.

This means you are, sadly, very uninformed. There is a lot of rolling with the punches in the games industry, and many engineers want to optimize things more, and work OT to do so (as there is an extreme shortage of time to do this in AAA space on company time).

imiric
2 replies
20h51m

> This means you are, sadly, very uninformed.

No, it means that I don't accept budget and deadlines being an excuse for delivering a poor experience. As a consumer, I'm speculating about the reasons why this happens, but my point is that it shouldn't happen at all.

> many engineers want to optimize things more, and work OT to do so (as there is an extreme shortage of time to do this in AAA space on company time)

Again, this is an industry problem, and not something companies should be excused for.

Whether engineers actually care about optimizing or not, and whether they crunch or not (as much as I may sympathize), is not my concern, and I place equal blame on them for delivering a subpar product, whether it's under their control or not. Ultimately their names will be listed in the credits, and they represent the product as much as the publisher. If they don't like the environment of a particular studio, they can always choose to work elsewhere.

clnq
1 replies
20h43m

> I place equal blame on them for delivering a subpar product, whether it's under their control or not.

What else is there to say...

> If they don't like the environment of a particular studio, they can always choose to work elsewhere.

They like it. The industry just has issues beyond their control which are in the process of being solved, gradually. No one will drop their dream job to satisfy your entitlement right now, sorry to say. You are free to not buy the game.

imiric
0 replies
20h32m

I'm entitled because I want to buy a product that works as advertised?

> You are free to not buy the game.

Yes, I'll continue to do so. I just wish other consumers did the same so that this situation can improve. The first step is not excusing it when it happens, but condemning it.

kossTKR
3 replies
22h30m

I get that to an extent, but it would literally take at most a day for a person to run thousands of meshes through an acceptable SimplifyGeometry function, or to sort all models by vertex count and remove the most idiotic ones, or remove the teeth in one go.
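For scale, even just finding the offenders is an afternoon of work. A throwaway sketch that walks an asset folder and flags every mesh over a vertex budget (the .obj format, folder layout, and 5,000-vertex cap are my assumptions, not Colossal Order's actual pipeline):

```python
import os

def obj_vertex_count(path):
    # Wavefront .obj stores one "v x y z" line per vertex
    with open(path) as f:
        return sum(1 for line in f if line.startswith("v "))

def audit(asset_dir, budget=5000):
    """Return (vertex_count, path) for every .obj over budget, worst first."""
    offenders = []
    for root, _, files in os.walk(asset_dir):
        for name in files:
            if name.endswith(".obj"):
                path = os.path.join(root, name)
                count = obj_vertex_count(path)
                if count > budget:
                    offenders.append((count, path))
    return sorted(offenders, reverse=True)
```

The point isn't this exact script; it's that surfacing the 100k-vertex log piles is cheap compared to actually shipping them.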

ajmurmann
2 replies
22h13m

Sure. Who is gonna prioritize that? Engineers likely have no discretionary time left and are even working weekends. Even on a web project I've been on, I had a PM complain in every meeting about loading times of an admin interface. I told them every time that they should then prioritize the pagination ticket I had written. After months of this shit, I took time out of my Saturday and just added it. I wouldn't have done that if I had already had to work nights and weekends.

kossTKR
0 replies
14h38m

Not blaming the engineers! The problem is structural, or from managers and higher-ups.

This is why most non-technical business school types can fuck right off; I've rarely seen them make anything better.

Managers with either dev or design experience are the only thing that works.

jokethrowaway
0 replies
21h58m

Working overtime is just insane to me. Why can't you just ignore the PM and do it during normal hours?

What are they going to do? Fire you?

BaculumMeumEst
0 replies
22h2m

Really? Shipping logs with 100k vertices and people with individually rendered teeth is an "obvious budget and deadlines" issue?

It strikes me as a "how many idiots are in a position to make important decisions on this game" issue, or "how generally competent is the development team" issue.

xyzzy_plugh
8 replies
22h37m

I typed something very similar in parallel with you. Guess the engineering typically required is no more? At least not here.

mschuster91
7 replies
22h33m

Pay peanuts get monkeys. Game dev has been infamous for taking in young, fresh college graduates, promise them "credits" and "fun life" and then run them through the grinder for shit pay. And eventually, even those who survived the grinder and ended up living long enough to become seniors burn out, and that's how you get this kind of clusterfuck in the end.

Game dev seriously needs to follow the VFX industry and unionize. I have zero trust left in fellow gamers to not buy games from unethical producers.

ajmurmann
5 replies
22h30m

Make people work 12 hour days to ship before you go out of business, and corners will need to get cut. Insulting engineers and describing them as "monkeys" because you are unaware of how businesses function is quite unwarranted. "Real engineers" need to take a real look at themselves!

mschuster91
4 replies
22h28m

"Pay peanuts get monkeys" is a proverb.

> Make people work 12 hour days to ship before you go out of business and corners will need to get cut.

Won't happen. The US barely has any employment laws, and neither do many other countries of the world.

ajmurmann
2 replies
22h17m

What won't happen?

mschuster91
1 replies
21h36m

As long as the government doesn't ban employing people for 12 hours or more straight for weeks, and actually enforce that ban, there will always be enough employers doing so, and enough people willing to go through with it "for the credits".

ajmurmann
0 replies
21h16m

Especially in an industry that sees depressed wages because it's the dream job for many.

I'm still not sure what from my original statement won't happen.

anticensor
0 replies
11h20m

Colossal Order is in Finland and Paradox Interactive is in Sweden, both of which have strong employment laws.

chc
0 replies
22h27m

Game dev in general is that way, but my impression was that Colossal Order had traditionally been a little better than most. I suppose I may have been mistaken.

capableweb
1 replies
22h22m

> Reminds me of the state of frontend development where you need 8000 tools and dependencies to show even the simplest of things, and i'm not surprised they bundle React for the menus.

Correction: React/Web technology is responsible for all the UI, from the loading screens to in-game labels when using road tools and everything in-between.

And their implementation of Coherent Gameface is not the reason for the performance issues in the game, so not sure how it's even relevant.

jokethrowaway
0 replies
22h2m

I think parent meant to imply there is a similarly wasteful culture in frontend development so of course they'd use react for menus and labels.

lozenge
0 replies
20h58m

Most of the revenue comes in after, long after release. So it now makes sense to release an unfinished game and use the revenue to pay for the improvements.

thrillgore
25 replies
20h45m

"And the reason why the game has its own culling implementation instead of using Unity’s built in solution because Colossal Order had to implement quite a lot of the graphics side themselves because Unity’s integration between DOTS and HDRP is still very much a work in progress and arguably unsuitable for most actual games."

This sadly tracks with my own experiences with Unity's tooling, where DOTS did ship but its implementation rots on the vine like every other tool they acquired. The company is woefully mismanaged, it's been mismanaged, and given the very public pricing incident from a few weeks back, they aren't focusing on improvements to the engine, but on any way to scrape money from its users.

Bevy's ECS implementation is really good, and I want to see it succeed here, in addition to Godex.
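To be fair, rolling your own culling isn't exotic; the core test is small even if the production version (frustum planes, BVHs, jobified batches) isn't. A toy distance-plus-view-cone check, just to show the shape of it (illustrative only, not Unity's or CO's actual code):

```python
import math

def visible(obj_pos, cam_pos, cam_forward, cos_half_fov, far_plane):
    """Toy CPU culling: reject objects beyond the draw distance or
    outside the camera's view cone before they reach the GPU."""
    to_obj = [o - c for o, c in zip(obj_pos, cam_pos)]
    dist = math.sqrt(sum(d * d for d in to_obj))
    if dist > far_plane:
        return False  # too far away to matter
    if dist == 0.0:
        return True   # camera sits on the object
    # cosine of the angle between the view direction and the object
    cos_angle = sum(d * f for d, f in zip(to_obj, cam_forward)) / dist
    return cos_angle >= cos_half_fov
```

The hard part is never this check; it's running it over hundreds of thousands of objects per frame without stalling, which is exactly where a half-finished DOTS/HDRP integration hurts.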

Sakos
18 replies
17h33m

I don't understand how Unity burns through up to a billion in revenue every single year, yet their engine still feels so half-baked and unpolished. Where's all that money going?

djbusby
10 replies
16h57m

Lots of cash can be burned in meetings and by managers of managers. I'm sure my case is not unique here, but meetings that cost $100k are common at $BigCo. And there are dozens of those per day.

paulddraper
9 replies
16h24m

You have 5-hour long meetings with 40 people each costing $500/hr?

EwanG
4 replies
16h11m

Let me introduce you to the world of PI planning where two days of every quarter are spent with approximately 100 people (developers, program managers, etc) for each line of business to plan out the next quarter of work - even if 90% of it is continuing the work from the last quarter...

azemetre
2 replies
15h30m

Two days sounds lucky. I've been in SAFe planning where it took an entire week to plan things; it was unreal. Like you, I met people I'd never seen or heard of from the company, all giving their input on software I'm creating. I'd never see them again until the next SAFe planning, and each time it felt like only 1 SWE out of maybe 15 (yes, there were only 3 teams of 5 devs, yet 100+ people in these SAFe meetings) contributed anything meaningful.

The company wasted so much money, and the org was then shut down for spending $500 per customer per year on maintenance (this was a health insurance company), whereas the main company would only spend $70 per customer per year. You'd think the leadership of the org would get fired for this, but they were rewarded with other positions within the company to do the same thing.

Unreal. Why do shareholders put up with this? I guess the healthcare monopoly is the only reason they do.

tter3
0 replies
14h57m

Half a day is spent on collecting and analyzing the confidence vote. The results of the confidence vote have no impact on the plan except as a baseline for the next PI.

KingMob
0 replies
9h10m

Ugh, I still shudder at our weeklong SAFe planning. At least I got a couple free trips to Europe for the meetings.

edwardsdl
0 replies
15h29m

… and inevitably three weeks later priorities change and it all goes out the window.

spenczar5
1 replies
11h36m

AWS has (or had) a weekly 2-hour operational status meeting. It has over 200 people in it, a mix of distinguished and principal engineers, very senior managers, etc. That meeting easily crossed the $100k-per-pop threshold. Worth it, though, that was actually a pretty great institution.

paulddraper
0 replies
3h6m

> 200 people

> 2 hours

> weekly

Kill me

---

EDIT: also, 200 people for 2 hours is only 100k if average salary is $500k/year

djbusby
0 replies
15h51m

Sometimes the meeting has folks that are even more expensive (not me). Sometimes there are more than 50. Four hours of meetings, six hours of wall-clock.

Edit: some companies have >20k employees. My first years at MS (late 90s) had these "war room" meetings like 2x a week, 50 people, multiple VPs in the room, 2h. But there were other groups doing their war rooms too.

Tangurena2
0 replies
2h47m

I was involved in a large software project with a large financial institution. Every weekday, there was an hour long status meeting. Based on what my company billed my time at (and presuming the other 40 companies billed their staff time likewise), that morning status call cost about $50k.

intelVISA
4 replies
16h56m

Agile shaman took the wheel

psunavy03
3 replies
16h16m

Agile done properly is literally the opposite of this. Big Important Exec spouting Agile terms they don't understand and cluelessly forcing top-down crap on the teams, more likely. Normalize your story points, prole! Daddy needs metrics!

torginus
1 replies
7h37m

Personally, I've never seen 'Agile' done well. Every project where we made good progress was engineer-driven. We talked to customers about what they wanted to see, and we created tasks and implemented them.

Every project managed by a dedicated servant-leader Six Sigma Kanban-certified Scrum ninja-coach has been a bureaucratic shitshow, with constant cargo-culting of agile principles and forced ceremonies.

andreasmetsala
0 replies
7h14m

> Personally, I've never seen 'Agile' done well. Every project where we made good progress was engineer-driven. We talked to customers about what they wanted to see, and we created tasks and implemented them.

> Every project managed by a dedicated servant-leader Six Sigma Kanban-certified Scrum ninja-coach has been a bureaucratic shitshow, with constant cargo-culting of agile principles and forced ceremonies.

The engineer-led project sounds more agile than the latter Agile insanity.

red-iron-pine
0 replies
2h41m

> Agile done properly

No True Scotsmen would do programming this way!

animal531
1 replies
6h49m

Their worst problem (compared to Epic) is that they don't finish projects.

They'll want a new feature, for example new scriptable graphics pipelines that let them modify the core rendering much more easily. OK, that's fine, but then instead of implementing one they try to implement two, while completely ignoring the existing pipeline that everyone is currently using. The time to get the new pipelines working properly and to onboard everyone is, in this case, counted in years.

Those features they actually "finished", but there are so many experimental packages that just sit there forever, or work in a half-baked manner.

DonHopkins
0 replies
2h8m

They hired the TextMesh Pro dude Zoltran, what must have been six or so years ago, with the promise that he'd integrate TextMeshPro into the "new" UI system, which I was looking forward to. I was just recently reading Unity's web page about what's new in the UI system, which announced that TextMesh Pro is finally built in! That took a while.

https://unity.com/features/ui-toolkit

I really love TextMesh Pro and Zoltran's mesmerizing tutorials. I learned a lot about Unity reading his code, and he was totally into what he was doing and hyper-productive when he was a highly regarded indie developer, but I got the impression (just guessing, no inside knowledge) that he wasn't 1/100th as productive working inside Unity as he was on his own, or maybe they put him on something other than what he originally went there to do.

It's been my experience that user interface toolkits are often turf war shit shows. I hope he's doing ok there!

https://www.youtube.com/@Zolran/videos

i.e. the classic 9-year-old Unite 14 demo:

https://www.youtube.com/watch?v=q3ROZmdu65o

torginus
2 replies
7h18m

Tbh, there are many ways to skin a cat; using ECS is not necessary to ship a performant game, nor is it necessarily the modern thing to do. It's one of many architectural patterns you can build a game on top of, and it has advantages as well as drawbacks.

For example, Godot isn't really built around ECS, but around the idea of servers: mostly autonomous game subsystems that process their area of expertise (rendering, physics, etc.) largely independently and are loosely coupled to the general game logic.

ECS architectures originate in the PS2/PS3 era, when CPUs were awful. Tiny caches, horrible branch-mispredict penalties, slow memory, fragmented memory spaces and the lack of random-access storage forced developers to build their games around predictable memory access patterns. That in general resulted in streaming architectures, where the data the game needs is streamed in and processed in tiny chunks.

While generally this is good practice even nowadays, with the advent of superfast CPUs with amazing speculative execution, great branch predictors, and tens of megabytes of cache, this is no longer strictly necessary, especially considering that most modern games haven't really increased that much in terms of stuff going on on the screen compared to say, a decade or two ago. It's still uncommon for the player to fight more than a dozen dudes in an action game.

And in games with thousands of things on screen at the same time, often specialist logic and handling is necessary.
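To make the layout argument concrete, here's array-of-structs versus the struct-of-arrays shape ECS favours, as a Python toy (the cache and vectorisation win only really materialises in a compiled language; this shows the data layout, not Unity's actual DOTS API):

```python
N = 1000

# AoS: one object per entity -- scattered allocations, pointer-chasing
entities = [{"x": float(i), "vx": 1.0} for i in range(N)]

def step_aos(dt):
    for e in entities:
        e["x"] += e["vx"] * dt

# SoA: one flat sequence per component -- the contiguous layout ECS favours
# (in C/C++/Burst this loop vectorises and streams through cache lines)
xs = [float(i) for i in range(N)]
vxs = [1.0] * N

def step_soa(xs, vxs, dt):
    for i in range(len(xs)):
        xs[i] += vxs[i] * dt
```

Both updates compute the same positions; the difference is purely how the data sits in memory, which is where the PS2/PS3-era constraints bit.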

nvm0n2
1 replies
2h33m

Sure, but surely a city simulation is the perfect textbook use case for a cache-friendly architecture like ECS?

torginus
0 replies
1h28m

Well, ECS only makes some operations cache friendly/parallelizable - namely linear iteration through the array of entities, where not all fields of each entity are used.

I'd imagine a city simulation involves a lot of graph lookup and traversal, spatial lookups and is generally a hairy and messy affair, not sure how easy it is to adapt that to ECS.

Then again, SimCity and Transport Tycoon simulated complex cities on, by today's standards, very frugal hardware, so city simulation might not even be such a CPU hog.
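For what it's worth, the spatial-lookup part at least maps fine onto flat storage: a uniform grid turns "what's near this point?" into a scan of 9 cells instead of every entity. A toy sketch (the cell size and API are invented for illustration):

```python
from collections import defaultdict

class UniformGrid:
    """Toy spatial index: bucket positions into fixed-size cells so a
    proximity query only scans 9 cells instead of every entity."""
    def __init__(self, cell_size):
        self.cell_size = cell_size
        self.cells = defaultdict(list)

    def _key(self, x, y):
        return (int(x // self.cell_size), int(y // self.cell_size))

    def insert(self, entity_id, x, y):
        self.cells[self._key(x, y)].append((entity_id, x, y))

    def query_near(self, x, y):
        # yield everything in the 3x3 block of cells around (x, y)
        cx, cy = self._key(x, y)
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                yield from self.cells.get((cx + dx, cy + dy), ())
```

The graph traversal (pathfinding over the road network) is the part that fights ECS's linear-iteration sweet spot; spatial queries like this are comparatively easy to keep cache-friendly.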

wilg
1 replies
18h39m

DOTS is homegrown isn’t it?

thrillgore
0 replies
17h3m

Right, it is. I intended to talk about Bolt and ProBuilder, tools they bought, added to the engine, and then left to rot.

Although to build DOTS they did poach a lot of ECS and Data-Oriented folks like Mike Acton, who left earlier this year.

johnnyanmac
0 replies
13h21m

>Bevy's ECS implementation is really good, and I want to see it succeed here, in addition to Godex.

Godex is ultimately putting lipstick on a pig. It can improve performance a bit, but ECS isn't some magical optimization to slap on as a plug-in; cache coherency in the gameplay layer can't fix engine-level bottlenecks.

jsyang00
22 replies
22h13m

Perf issues also basically killed the SimCity franchise on PC. Hope they are able to fix things up

capableweb
7 replies
21h45m

That's a huge misrepresentation of what happened to SimCity. EA released an incredibly user-hostile version of SimCity (always online, microtransactions and more) that almost no one liked, and Cities: Skylines was released around that time too.

tinco
3 replies
21h11m

I bought both when they came out, and the user hostile stuff didn't bother me at all. What killed sim city was most definitely the performance issues. Unless they had a better reason for restricting the maximum city size.

And then the fact that skylines had both a larger play area and more fancy city building features was just the killing blow.

EA got caught out, thinking they could leisurely bring out an inferior product, when a competitor emerged guns blazing.

notatoad
0 replies
21h2m

Yeah, the user-hostile DRM stuff was really just the icing on the cake. People regularly tolerate all that same stuff when the game is actually good. But when the game is shit, it makes it really easy to take a principled stance that you're boycotting the game because of DRM.

lmkg
0 replies
13h15m

> Unless they had a better reason for restricting the maximum city size.

They did! EA wanted you to play the game online. They encouraged you to connect your city with other players by making it difficult for cities to be self-sufficient.

I.e., user-hostile design.

dragonwriter
0 replies
19h49m

> I bought both when they came out, and the user hostile stuff didn't bother me at all.

“You" and “users generally” are different things.

mjrpes
1 replies
18h24m

The main reason I never purchased was the tiny map size. You could barely fit a neighborhood: https://www.reddit.com/r/gaming/comments/19nz93/sim_city_5_2...

Sakos
0 replies
17h27m

Was that a performance limitation or a conscious design choice because of their "cloud" feature?

brnt
0 replies
17h56m

Cities were absolutely tiny in Simcity 2013, they were barely neighbourhoods.

nfriedly
3 replies
20h20m

I thought it was DRM that killed SimCity (?)

DonHopkins
2 replies
19h27m

And the name of the DRM was Origin. It was all about some EA executive deciding to force Origin down everyone's throat, and using SimCity as the Astroglide.

squeaky-clean
1 replies
11h13m

It wasn't origin. It's because SimCity 2013 required you to always be actively online in order to play it. Origin doesn't enforce that.

DonHopkins
0 replies
10h19m

No, I have been working with SimCity since 1991, when I developed and in 1993 distributed online an actual multiplayer version of SimCity for X11 myself (it even had "node locked" and "floating" DRM, but worked fine offline or on a local private network, like when I showed at the ACM InterCHI '93 conference in Amsterdam, but of course I removed the DRM for the open source educational OLPC and "Micropolis" web based versions in 2008). So I know and worked with the people involved at Maxis and EA, and they told me the inside story of what actually happened at the time.

Multi Player SimCity for X11 is now available from DUX Software!

http://www.art.net/~hopkins/Don/simcity/simcity-announcement...

SimCityNet: a Cooperative Multi User City Simulation at InterCHI '93:

http://www.art.net/~hopkins/Don/simcity/simcitynet.html

SimCityNet on Sun Workstation:

http://www.art.net/~hopkins/Don/simcity/SimCity-Sun.gif

SimCityNet on SGI Workstation:

http://www.art.net/~hopkins/Don/simcity/SimCity-Indigo.gif

SimCityNet on NCD X Terminal:

http://www.art.net/~hopkins/Don/simcity/SimCity-NCD.gif

Multi Player SimCityNet for X11 on Linux:

https://www.youtube.com/watch?v=_fVl4dGwUrA

Open Sourcing SimCity, by Chaim Gingold:

https://donhopkins.medium.com/open-sourcing-simcity-58470a27...

Micropolis Online (SimCity) Web Demo:

https://www.youtube.com/watch?v=8snnqQSI0GE

It was all about EA wanting to roll out Origin as their online distribution channel like Steam, and Origin wanting SimCity to be the showcase product that they used to force people to install the Origin downloader in order to play SimCity, and the stubborn insistence that SimCity be "online only" as DRM, and refusal to admit it was a mistake and fix it in response to user demands, all came 100% from Origin, and was forced on Maxis against their will and better judgement.

Then Maxis senior VP Lucy Bradshaw had to take the public flack for that, fall on Origin's sword, and make the false announcement that SimCity wouldn't work offline, even though she opposed it and knew it was bullshit.

And the server meltdowns were all Origin's fault, because they simply didn't have their shit together.

Lucy's a good person who was put in a shitty position by EA management, and she and Maxis and Emeryville Studios and SimCity players and the franchise itself all got screwed and suffered because Origin fucked up, and insisted Lucy and Maxis publically take the responsibility and consequences for Origin's idiotic and stubborn mistakes.

So I wouldn't be shocked and surprised if Cities: Skylines 2 problems with production and shipping before it was ready had more to do with Paradox than Colossal Order.

SimCity Reboot (2012–2014):

https://en.wikipedia.org/wiki/SimCity#Reboot_(2012%E2%80%932...

>It has been suggested that the poor performance of SimCity was responsible for the 2015 closure of Maxis' Emeryville studios, and the end of the franchise.[44][45]

Lucy Bradshaw:

https://en.wikipedia.org/wiki/Lucy_Bradshaw_(game_developer)

>Bradshaw became senior vice president of Maxis in 2013, after serving as the studio's general manager.[5] Bradshaw oversaw development of SimCity, The Sims, and Spore.[6][7] She encountered controversy due to technical issues with the 2013 reboot of SimCity.[8][9]

Maxis explains what went wrong with SimCity and what the developer is doing to fix it:

https://www.polygon.com/2013/3/9/4081464/simcity-interview-e...

SimCity general manager Lucy Bradshaw on why the game 'is not an offline experience':

https://www.polygon.com/2013/3/15/4109480/simcity-general-ma...

Gridlock Plagues the New Online-Only SimCity:

https://www.nytimes.com/2013/03/09/arts/video-games/simcity-...

asylteltine
3 replies
21h37m

Drm killed simcity

vkou
2 replies
18h18m

If SimCity was good, gamers would have held their nose, and played it, DRM or not, just like they did with a million other DRM-heavy AAA titles.

It wasn't.

kristofferR
1 replies
15h18m

SimCity wasn't just "DRM heavy", it was totally unplayable for the first week after the launch because a ton of people were trying to play it at the same time and being stopped by its overloaded DRM servers.

https://www.wired.com/2013/03/simcity-outage/

vkou
0 replies
13h49m

Other AAA titles have also had horrible, show-stopping launches. If the core game is good, the players will tolerate a lot of bullshit.

jcranmer
2 replies
13h39m

That's not entirely true.

SimCity 4 was the last of the statistical city simulators. That is to say, if you put down something like a distressed Ong Condos, the game would go "okay, here's 20,778 people in this node in the transportation network, do some network flow and figure out where they work." Newer titles, including the ill-fated SimCity (2013), are agent-based, where they essentially drop in a person and tell them to figure out where they can go or something like that.
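A toy sketch of the difference, purely illustrative (the functions, node names, and numbers below are made up for exposition, not from any real SimCity code): a statistical simulator's cost scales with the number of network nodes, while an agent-based simulator's cost scales with the number of citizens.

```python
# Statistical style (SimCity 4): aggregate populations flow through
# network nodes, so a node with 20,778 people is one arithmetic step.
def statistical_commute(residential_nodes, job_capacity):
    """Assign aggregate populations to jobs; cost is O(nodes), not O(people)."""
    assignments = {}
    for node, population in residential_nodes.items():
        # Move the whole node's population toward jobs in a single step.
        placed = min(population, job_capacity.get(node, 0))
        assignments[node] = placed
    return assignments

# Agent-based style (SimCity 2013): every citizen is simulated
# individually, so cost grows with the number of agents -- which is
# one reason this approach scales so much worse.
def agent_commute(agents, job_slots):
    assignments = []
    for agent in agents:                  # one loop iteration per citizen
        for slot in job_slots:
            if slot["open"]:
                slot["open"] = False
                assignments.append((agent, slot["id"]))
                break
    return assignments
```

With 20,778 people in one node, the statistical version does a handful of arithmetic operations; the agent version does 20,778 searches for an open job slot.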

As for why SimCity (2013) was a series-ending failure, it was essentially that at every single juncture, EA chose the stupidest possible option, the one that would most guarantee the failure of the series.

The first failure was in the engine itself. AIUI, the developers who came up with the modeling engine knew it wouldn't scale, and never intended for it to actually power a full-fledged SimCity mainline title. EA decided it was going to be exactly that anyway. The simulation just didn't scale; the small maximum size of the cities was meant to prevent people from attempting to build anything that would cause the simulation to keel over and die--although it's also clear that even that max size limit was too large for the simulation. This is probably where you're getting your idea of perf issues from.

The most famous issue, though, is EA's requirement that the game have multiplayer features to justify being always online (and thus having more intrusive anti-piracy checks). SimCity doesn't lend itself well to multiplayer functionality, and many people correctly assumed that it was a gimmick to justify anti-piracy. Bizarrely, EA tried to claim for a while that one advantage of always-online was to be able to use more powerful servers to do the simulation calculations, but this was never implemented, and the lie was discovered extremely rapidly.

But the single stupidest decision was that EA never put in place enough server capacity to handle the launch. The game would be EA's first always-online flagship release, which meant it would strain their capacity like nothing else they had released ever did. The lack of capacity was repeatedly highlighted as a potential issue in prerelease, and yet was repeatedly ignored. Problems were reported during beta testing and the press preview period, and were still ignored. And launch day came, people experienced severe issues, and it still took EA a few weeks to begin to address the problem.

Oh, and you can't forget all the other regular video game release issues like games being released because they need a release date rather than because they're done and things like that.

DonHopkins
1 replies
9h11m

>SimCity doesn't lend itself well to multiplayer functionality

You make a lot of great and accurate points (including blaming EA instead of Maxis). But I disagree that SimCity doesn't lend itself well to multiplayer functionality. (And perhaps EA isn't the right company to pull it off properly, being more focused on competition and violence than collaboration and education.)

I agree that it would be difficult to adapt SimCity to competitive play without over-complexifying and spoiling it, but SimCity is ideal for collaborative play, and constructionist education (as envisioned by Seymour Papert and Alan Kay)!

I designed and implemented a collaborative multiplayer version called SimCityNet that I released online in 1993 (with node locked and floating DRM, but that worked offline or on a private network), including text chat and shared whiteboard overlay and voting dialogs and pie menus, which I described in my other post:

https://news.ycombinator.com/item?id=38155984

Multi Player SimCity for X11 is now available from DUX Software!

http://www.art.net/~hopkins/Don/simcity/simcity-announcement...

SimCityNet: a Cooperative Multi User City Simulation at InterCHI '93:

http://www.art.net/~hopkins/Don/simcity/simcitynet.html

Multi Player SimCityNet for X11 on Linux:

https://www.youtube.com/watch?v=_fVl4dGwUrA

Back when ActiveX was a thing, Mike Perry at Maxis produced a web browser plug-in version of SimCity that was simply integrated with a text chat window. Although you were playing your own local copy of SimCity, you could chat with other users playing their own cities in their own browsers, brag and gossip about your city, ask questions, and help each other. It was a huge hit relative to the amount of effort it took to implement, despite its clunky, stripped-down user interface, and it demonstrated how essential and engaging communication between players was, even if they couldn't actually affect each other's games or share save files.

https://en.wikipedia.org/wiki/Mike_Perry_(game_developer)

http://www.andywest.org/pr/simcity/misc/introduction.html

>Meanwhile SimCity Classic did not die: Maxis/EA has transformed it into a Web game you can play from your browser (after registration). The Web SimCity plays just like the computer game, except that it does not have every feature. Also, since the game is an ActiveX control, you must use Internet Explorer to play it.

I also designed and produced two different educational versions of SimCity/Micropolis, one an OLPC XO-1 Laptop interface for kids (X11/Cairo/PyGTK/Sugar) and one a web client/server interface (Python/TurboGears/AMF/OpenLaszlo/Flash) for an online community around shared multiplayer games and a storytelling platform. I completed MVP first-cuts, shipping the single player X11 version on the OLPC, and a web interface supporting multiple players sharing simulations running on the server.

https://github.com/SimHacker/micropolis/blob/master/turbogea...

https://github.com/SimHacker/micropolis/blob/master/laszlo/m...

I haven't fully completed the long term plans for the web based version, because Sugar never panned out then Flash died, and I didn't have the resources or support.

Finding support for developing educational software is tough, and I'd rather contribute new stuff to a truly free open educational community inspired by Seymour Papert's philosophy, like Snap!, than sharecrop on EA's intellectual property.

https://snap.berkeley.edu/

I believe Seymour Papert's collaborative constructionist educational ideas can be applied to other city simulators, many other types of games, and especially visual programming environments like Snap!

It's fortunate that SAP supports Jens Mönig to continue building on top of the block based visual programming ideas that he, Brian Harvey, John Maloney, Yoshiki Ohshima, Alan Kay, the MIT Media Lab Scratch project, and others developed:

https://faberllull.cat/en/resident.cfm?id=38258&url=jens-mon...

Micropolis Online (SimCity) Web Demo:

https://www.youtube.com/watch?v=8snnqQSI0GE

Micropolis: Constructionist Educational Open Source SimCity:

https://donhopkins.medium.com/har-2009-lightning-talk-transc...

Plan for developing Micropolis for OLPC:

https://github.com/SimHacker/micropolis/blob/master/micropol...

Micropolis for OLCP Sugar User Interface:

https://github.com/SimHacker/micropolis/blob/master/Micropol...

    Notes on adapting Micropolis to the OLPC Sugar user interface:

    Core Ideas:

      Activities, not Applications

        First cut: 

          Integrate the current TCL/Tk version of Micropolis to run as a simple activity within Sugar. 

            Restructure the multi-window TCL/Tk code to run in a single full screen window.
            Implement a simple activity-oriented tiled window management interface. 
            Disable advanced features like multiple editor and map windows, 
            that require more sophisticated window management. 
            Instead of using a traditional multi-window management approach, 

            Make a simple wrapper around it that makes it appear in the Sugar user interface as an activity, like eToys does.

          Long term:

            Implement activity specific modes that reconfigure the user interface (like Eclipse "perspectives").
              - build/edit oriented interface
              - query/analysis oriented interface
              - financial oriented interface
              - communication/coordination oriented interface
              - dynamic zone finder analysis
              - grid of several overall map views, each configured to show a different overlay. 
              - grid of several close-up map views, each centered on a different parts of the city (or tracking a player's cursor)

            Collaboration: Enhance multi player mode to support sharing activities.
              Both publishing your game for others to clone and play themselves (massively single player, like Spore),
              and letting others join in your game (like the current cooperative multi-player mode)). 
              Multi player inte

            Expression: Enhance chat, journaling, storytelling, and personalization aspects of the game. 

            Journaling: Record all events (both user edits and simulation events), chat messages and drawings.
              Checkpoint the game state, and implement the ability to deterministically replay time stamped 
              editing events into the simulation, so you can fast forward and rewind from any checkpoint to 
              any step of the simulation. 
              Enable players to write newspaper articles about the cities, with live links to a snapshot 
              of the simulation and a place on the map, related to the story. Other players could browse
              their published newspapers about the history of a city, and jump into that history at any time
              from any story. 

            Iteration: Checkpoint game save files, allowing players to rewind history, and try "what-if" experiments. 

      Presence is Always Present

        First cut:

          Enhance the current X11 based multi player interface to support presence, the grid network, and messaging.
          The current multi player interface runs a single Micropolis process on one laptop, 
          which connects to the local X server, and/or several other X servers on laptops over the net.
          Rewrite the "Add User" dialog to be grid-network aware. 
          Instead of asking for an X server DISPLAY screen, provide a list of friends on the network. 
          Send an invitation to play to friends on the network. 
          Rewrite the built-in chat interface to integrate with the chat system used by Sugar. 
          Improve the shared "white board" overlay, so kids can draw on the map in different colors, 
          enable and disable different overlays, save overlays with the map, add text to overlays, etc.
          Implement location based chat, by overlaying people icons and chat bubbles on the map. 
            Each player has a people icon "cursor" that they can move around the map (which follows 
            their selected editing cursor), and their chat messages show up in bubbles overlayed on the map.
            When you select an editing tool, you can type what you're doing with the tool, 
            other people will be able to watch you, and make comments on what you're doing.

        Long term:

          Rewrite Micropolis in terms of Python/GTK/Cairo, and take full advantage of the Sugar libraries and services. 
          Support sharing, mentoring, collaboration, voting, political dialogs, journaling, etc.
          Develop Micropolis into an exemplary, cutting edge demonstration of all that's great about Sugar. 

      Tools of Expression

        Micropolis is great at supporting personal expression, interpretation and storytelling, 
        and leveraging what the player already knows to make connections to new knowledge,
        and stimulating conversation, debate and analytical thinking.

        Develop a web based "Wikipedia" oriented interface to Micropolis, supporting collaboration, discussion, 
        annotation, history journaling, and branching alternative histories. 

      Journaling

        The "Micropolis Journal" could be realized as a web-based
        newspaper-like interface.

        Expose the multi player user interface through the web, instead of
        using X11.

        Automatically generate a newspaper for any particular time in a
        city's history, from the simulator events and state, combined with
        user written articles and chat messages.

        The newspaper has sections that present automatically generated
        snapshots of the information displayed in the various dialogs
        (graph, evaluation, chat, notices, etc), and stories about
        significant events (both user-generated and simulation-generated).

        Enrich the city save file with metadata including the chat and
        event journal, overlays, snapshots at different points in time (in
        a branching "what-if" tree structure), etc.

        In the Python version of Micropolis it will be easy to implement a
        web server based interface that lets users read the city's
        newspaper through the web browser, automatically inserting
        pictures of the map corresponding to particular events in time. An
        article about pollution going down could show a before and after
        overall map with the pollution overlay, and stuff like that.

        Plug in modules to the simulator that analyze the state of the
        city and generate events for the newspaper to write articles
        about, including interesting statistical information and other
        parameters to insert into the story template. 

        Implement "online surveys" that let newspaper readers vote on proposals
        (expose the voting interface to web based users).

        Use OpenLaszlo to develop a rich graphical AJAXian web service
        based Micropolis interface, eliminating the need for the X11
        interface, and enabling all kinds of interesting interface
        customizations and mash-ups with other web services.
[... lots more at:]

OLPC-Notes.txt:

https://github.com/SimHacker/micropolis/blob/master/Micropol...

PLAN.txt:

https://github.com/SimHacker/micropolis/blob/master/micropol...
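The checkpoint-and-replay journaling scheme sketched in the notes above boils down to one property: if the simulation step is deterministic, a checkpoint plus a timestamped event log fully determines every later state, so rewind and "what-if" branching are just "load an earlier checkpoint, replay fewer (or different) events." A minimal illustrative sketch -- the state, step, and event formats are hypothetical stand-ins, not actual Micropolis code:

```python
import copy

def apply_event(state, event):
    """Apply a single timestamped edit event (e.g. a tool use) to the city state."""
    state[event["cell"]] = event["tool"]

def step(state):
    """One deterministic simulation tick (here trivially just a counter)."""
    state["tick"] = state.get("tick", 0) + 1

def replay(checkpoint, events, until_tick):
    """Rebuild the state at any tick by replaying logged events from a checkpoint.

    Because step() is deterministic, replaying the same events from the
    same checkpoint always reproduces the same state -- which is what
    makes fast-forward, rewind, and branching histories possible."""
    state = copy.deepcopy(checkpoint)
    for tick in range(state.get("tick", 0), until_tick):
        for ev in events:
            if ev["tick"] == tick:   # apply edits scheduled for this tick
                apply_event(state, ev)
        step(state)
    return state
```

Replaying the same log twice yields identical states, so a newspaper article could link to "the city at tick N" and the reader would see exactly the same city.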

DonHopkins
0 replies
7h5m

Here's a proposal I wrote to Maxis in 2002 asking to license the rights to develop a commercial educational version of SimCity for Linux, based on the multi player Unix version, with the support of Columbia University. They didn't bite at the time, but Will Wright and I eventually talked them into actually making it open source in 2008:

https://web.archive.org/web/20110611185928/http://www.donhop...

>Educational Multi Player SimCity for Linux Proposal

>Submitted by dhopkins on Sat, 2004-02-07 14:27. C Cellular Automata Pie Menu Applications SimCity TCL Game Design

>Back in March 2002, Maxis told me they were interested in supporting the educational use of products like SimCity. Earlier, I had developed a multi player version of SimCity, which runs on Linux/X11, and was scriptable in TCL. Educators and researchers from Columbia University, MIT, IBM, Xerox and other educational and commercial institutions were excited about gaining access to this version of SimCity, and adapting it to teach and stimulate students' interest in urban planning, computer simulation and game programming.

>So I wrote this proposal and presented it to Maxis, but nothing ever became of it. But recently, Will Wright has been pushing EA to relicense SimCity under the GPL, so the OLPC project can use it. So it may eventually see the light of day!

[...]

More SimCity stuff from my old Drupal blog:

https://web.archive.org/web/20110611162810/http://www.donhop...

https://web.archive.org/web/20111109132608/http://www.donhop...

beart
2 replies
21h42m

I don't recall performance problems being the main concern for the last (final) Sim City, but I won't say you are incorrect. What I do remember is

1. shallow game play compared to its predecessors

2. demand for always online and proven lies about offline play not being possible because of the game architecture

3. invasive DRM, during a time when invasive DRM was on everyone's mind

4. launch issues which, combined with the always-online requirement, meant a solid "plop" of a release.

5. EA was already negatively viewed at the time by many PC gamers

It looks like the wikipedia article for the game mentions some of these, and other issues.

> SimCity's sixth major release was announced on March 5, 2012, for Windows and Mac OS X by Maxis at the "game changers" event.[31] Titled SimCity, it was a dramatic departure from previous SimCity games, featuring full 3D graphics, online multiplayer gameplay, the new Glassbox engine, as well as many other feature and gameplay changes. Director Ocean Quigley discussed issues that occurred during the development of the title, which stemmed from two conflicting visions coming from EA and Maxis. EA wanted to emphasize multiplayer, collaborative gameplay, with some of the simulation work conducted on remote servers, in part to combat piracy. In contrast, Maxis wanted to focus on graphical improvements with the new title. Quigley described the resultant title as a poor compromise between these two objectives- with only shallow multiplayer features, and a small city size limit- one quarter of the land area of previous titles in the franchise.[2][32]

> The game was released for Windows on March 5, 2013, and on Mac in August.[33][34][35] Medium would later refer to the release as "one of the most disastrous launches in history".[2] The game required a constant internet connection even during single-player activity, and server outages caused connection errors for many users. Multiplayer elements were "shallow at best", with departing players leaving abandoned cities behind in public regions. Users were unable to save their game- with the servers instead intended to handle this- and so when users were disconnected they would often lose hours of progress.[36] The game was also plagued by numerous bugs, which persisted long after launch.[37]

> The title was heavily criticized in user reviews, and developer plans for post-launch updates were scrapped.[2] EA announced that they would offer a free game from their library to all those who bought SimCity as compensation for the problems, and they concurred that the way the launch had been set up was "dumb".[38] As a result of this problem, Amazon temporarily stopped selling the game in the week after release.[39] The always-online requirement, even in single play, was highly criticised, particularly after gamers determined that the internet connection requirement could be easily removed.[40] An offline mode was subsequently made available by EA in March 2014, and a mobile port entitled SimCity: BuildIt was released later that year.[41][42][43]

> It has been suggested that the poor performance of SimCity was responsible for the 2015 closure of Maxis' Emeryville studios, and the end of the franchise.[44][45]

disconcision
1 replies
21h26m

indirectly. i'd say the largest single early complaint about simcity 2013 was the small maximum city size you quote, which people attributed to perf-related restrictions

beart
0 replies
20h56m

Ahh, I was categorizing that under "shallow game play", but I can see your point.

darklycan51
22 replies
22h51m

Why did they make the game in Unity instead of UE5? I assume it's something along the lines of "staff is more comfortable with Unity", right? So let's release a game that runs like absolute garbage because we want to use a specific engine.

Next time I wanna take out my graphics card from my PC I'm going to use scissors instead of screwdrivers because I'm more comfortable with them.

When did the gaming industry become a place where what people inside the company want matters more than the end user? People love to cry about developers not being responsible for the state the gaming industry is at right now, but I disagree, and this is an example. Had they simply responded with "We cannot do it with Unity" management would have had no choice but to switch to Unreal 5.

andybak
5 replies
22h45m

UE5 isn't some magic pixie dust and Unity isn't a curse from which it is impossible to recover.

Whilst some of the issues here are potentially the fault of DOTS and HDRP not being production ready (in combination at least) there's plenty of blame to go around.

Elsewhere other people are making gorgeous and performant games in Unity.

Arelius
4 replies
22h40m

Yeah, and city builders have a very unique set of constraints that make them a misfit for most modern off-the-shelf engines.

And remember, despite Epic's efforts, UE is still a first/third person engine at its core. I wouldn't be surprised if the challenges using Unreal would be at least as great if they had chosen that.

darklycan51
1 replies
21h49m

Stormgate is using Unreal engine just for the graphics and it's doing pretty fine...

Arelius
0 replies
21h16m

I'm not sure that's super relevant.

Firstly, just because they can use Unreal doesn't mean that it isn't causing problems; even you state that they are using it just for their graphics, which implies challenges.

It's a completely different game by a completely different team. And it doesn't appear to be released yet. So it seems very premature to be judging it based on runtime performance.

Now, personally, I would choose to build an RTS/builder in Unreal over Unity. But that's almost entirely due to my experience with Unity the company and their antagonistic incentives towards their developers, and the ease with which you can get source access, and little to do with technical engine fit. Honestly, in my professional opinion, the design of Unity is more flexible in this regard, and I'd consider it a reasonable decision to try to use Unity from a technical engine design POV.

andybak
1 replies
22h37m

DOTS was actually a perfect fit for complex simulations (and the article hinted that it probably had a positive effect on CPU usage)

All the issues are around rendering and HDRP - which by the sound of things is still not properly integrated with DOTS.
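For anyone unfamiliar with why data-oriented designs like DOTS suit big simulations: components live in flat parallel arrays, and "systems" are tight linear loops over those arrays, which keeps the CPU cache hot. A toy illustration of that layout (this is not Unity's actual DOTS/ECS API, just the general struct-of-arrays idea):

```python
class World:
    """Entities are just indices into parallel component arrays."""
    def __init__(self):
        self.pos_x, self.pos_y = [], []   # position component, struct-of-arrays
        self.vel_x, self.vel_y = [], []   # velocity component

    def spawn(self, x, y, vx, vy):
        self.pos_x.append(x); self.pos_y.append(y)
        self.vel_x.append(vx); self.vel_y.append(vy)
        return len(self.pos_x) - 1        # the entity id is just an index

def movement_system(world, dt):
    """Update every entity's position in one cache-friendly linear sweep."""
    for i in range(len(world.pos_x)):
        world.pos_x[i] += world.vel_x[i] * dt
        world.pos_y[i] += world.vel_y[i] * dt
```

Compare with an object-per-citizen design, where each update chases a pointer to a heap-allocated object; with hundreds of thousands of agents, the memory layout alone makes a large difference.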

Arelius
0 replies
22h34m

Fair enough, but Unreal's Renderer isn't the best fit for this sort of game either.

And their data-based ECS is still behind DOTS for simulation, last I checked.

BillFranklin
2 replies
22h41m

The model LOD issues described in the article can be fixed without changing engine. It's slightly easier to hire Unity devs to work around Unity issues vs hiring Unreal devs because there are many more Unity devs (Unreal has 13% of the market). Plus Unreal takes a 5% royalty fee on all sales, vs a flat fee for using the Unity engine.

capableweb
0 replies
22h35m

I don't think that was a huge concern for Colossal Order (the developer of Cities: Skylines), as the team is very small and they're not trying to grow like crazy. They seem happy being a smaller studio, so having 100,000 Unity devs available on the market vs 10,000 Unreal Engine devs makes less of a difference (numbers made up, don't quote me on those).

Arelius
0 replies
22h36m

What is that comparing? Beware of sampling mismatch. Due to Unreal's use in AA and AAA and the size of those teams, I wouldn't be surprised to see the statistics reversed when you limit the talent pool to the set of people you need.

Arialonomus
2 replies
22h18m

I would argue that UE5 is even less suited to a game like this than Unity is. Unreal certainly has impressive rendering tech, and it has designs towards increasingly becoming a generalist engine, but it is clearly designed with certain genres in mind (i.e. 1st and 3rd person games like RPGs, Shooters, Action games, etc.). A city-builder in UE5 would present a whole host of other challenges, and many of the high-tech rendering features would likely be overkill. Not to mention, Unreal games have notorious performance issues of their own--though there is dedicated effort to resolving those.

Unity is designed more as a general engine, but it comes with a lot of baggage in terms of half-baked features and optimization difficulties. As the author mentions they really unlocked their potential with implementation of Unity's ECS framework, but they were still chained to Unity's rendering tech, which has been underdeveloped for several years now.

My observation tends to be that simulation games are the ideal case for custom engines. While there are some commonalities across games, compared to many other game genres, they don't get a lot of benefits from standardizations. Sim games often end up kneecapped by trying to conform to existing engine frameworks instead of spinning up something optimized to the way their systems work. It requires a lot more technical know-how than an action-adventure game or a platformer, and the up-front cost to developing your own tech is an order of magnitude compared to using out-of-the-box solutions. I think with the massive success of C:S, Colossal Order was in an excellent position to try something ambitious.

Maybe with open-source tools like Godot having more flexibility in their frameworks, where you can just get the parts you want (rendering approach, etc.), it'll be easier in future to develop more specialized custom tech for games.

Arelius
1 replies
21h10m

This for sure.

City builders, and certain classes of RTS, are really the last major forms of games that are very poor fits for modern off-the-shelf engines.

Honestly, trying to build one in either Unreal or Unity is going to be a painful experience, with challenges likely surpassing those of just building the engine you need in the first place.

But engine selection is really only a technical decision less than half the time these days anyway.

darklycan51
0 replies
19h43m

Frost Giant is building Stormgate, the spiritual successor of StarCraft 2, with even better/smoother gameplay than SC2, in Unreal Engine 5. They picked which parts to reuse, such as the renderer, and built the underlying stuff from scratch, but still.

spoonjim
1 replies
22h42m

> When did the gaming industry become a place where what people inside the company want matters more than the end user?

Happened everywhere. Now if you run an intense company where people are expected to work hard towards ambitious goals or be fired, you’re “toxic”

DonHopkins
0 replies
16h36m

It sounds like you're attempting to carry water for Elon Musk. Good luck with that.

The very definition of a toxic brogrammer, Alex St John literally glorified the benefits of slavery and Microsoft's imperialistic dominance of game developers in his own words [1], describing his infamous Toga Party / Roman Orgy / Slave Auction he threw in the Spartan Stadium at the 1996 Computer Game Developer Conference (which I and my team from Maxis attended, and personally witnessed the slave auction and free silly string and live lions -- not all were caged, some were just tied up):

[1] http://www.alexstjohn.com/WP/2013/03/06/bunnygate-pt-2/ [offline and banned from archive.org]

>"Sex, violence and debauchery were all hallmarks of a great Roman “orgy” and I wanted to come as close to capturing that experience as I could get away with."

>"The slave auction was also a special occasion. The Roman guards were rounding up the losers from the gladiatorial events and anybody who was offending a “senator” or Cleopatra and throwing them in the slave pit. Once a sufficient number of slaves had been collected from the party, Gillian returned to the stage to auction them to the audience for the gold they had won in the games. Gillian could set any base price she wanted for a “specimen”, if the audience wouldn’t pay the price the slave would be “thrown to the lions”. We had configured a “pit” full of foam to one side of the stage and directly in front of the first caged lion. A failed slave would be marched to the edge of the pit at spear point and pushed in by the praetorian guard where they would fall into box full of foam below the audiences line of sight. Then a huge video clip of lions tearing up a wildebeest and roaring would be played on the big screens on either side of the stage accompanied by the sounds of screaming. Down in the foam pit we had a collection of fake bloody body parts that somebody would toss up in the air during the lion feeding. Of course the “victim” was then allowed to sneak out the back and return to the party."

Yes, the same Alex St. John who is so incredibly toxic that his own daughter publicly denounced him and his "horrific toddler meltdown" and "toxic waste trash fire":

I Am Alex St. John’s Daughter, and He Is Wrong About Women in Tech (medium.com/milistjohn):

https://news.ycombinator.com/item?id=11542568

https://medium.com/@milistjohn/i-am-alex-st-john-s-daughter-...

>My name is Amilia St. John and I am the daughter of Alex St. John. Yes, that one. For those not following the horrific toddler meltdown my father has been very publicly broadcasting over the past few days, here is a short summary; My father, posted an article recently on venturebeat.com claiming that:

>“Many modern game developers have embraced a culture of victimology and a bad attitude toward their chosen vocations.”

>and how:

>“[he] can’t begin to imagine how sheltered the lives of modern technology employees must be to think that any amount of hours they spend pushing a mouse around for a paycheck is really demanding strenuous work.”

>— The usual self aggrandizing agenda that older generations like to peddle on days when they need to feed their superiority complexes. My father’s article led to a massive outcry from the gaming industry and a subsequent invasion of my father’s blog by the (rightfully) angry internet masses.

>On the blog, they uncovered extremely distasteful recruiting slides and supplemental blogs with revolting opinions regarding women, minorities and those with autism in the tech industry. Since these findings, countless others and I have found ourselves at a loss for words how anyone, especially someone in a position of power, can think that it is acceptable to broadcast such offensive material.

>As his toxic waste trash fire not only is associated with my last name but also my face, I felt compelled to respond to my father’s sexist, ableist, and racist rants. [...]

Alex St John’s Ideas About Game Development Are Terrifying:

https://www.kotaku.com.au/2016/04/alex-st-johns-ideas-about-...

>Earlier today you would have seen the story how Alex St. John, the creator of DirectX and founder of WildTangent, came out over the weekend trying to recharacterise the conditions of game development. It’s art, not a job, and game developers shouldn’t be as concerned with stress, work-life balance or fair wages. Those were some of the arguments St. John made, although you should read them within to understand the full context.

>St. John also has strong views on who precisely game studios should hire. It’s contained in a presentation called “Recruiting Giants” and suggests, among other things, that coding isn’t actual work, “real” programmers don’t value money and that working juniors and interns so hard they burn out is “good for them”. [...]

Dispelling some myths about the autistic wunderkind programmer:

https://www.gamedeveloper.com/business/dispelling-some-myths...

>In April 2016 Alex St. John, one of the developers behind Microsoft's DirectX technology platform, wrote an article in which he defended eighty hour working weeks, scorned work-life balance and memorably described a porcelain toilet as an "incredibly decadent luxury." St. John’s article was widely criticized for promoting what many consider to be exploitative working practices and attitudes in the video game industry.

>In the ensuing outrage, a PowerPoint presentation written by St. John surfaced which included a slide in which he referred to engineers with Asperger syndrome as being "the holy grail" of hires. "They work like machines," he wrote, "don’t engage in politics, don’t develop attitudes and never change jobs." [...]

Xeamek
1 replies
22h37m

What makes you claim that a game like this can't be made in Unity?

Cause honestly this sounds like the typical "Unity bad, UE with their flashy trailers good". But maybe you actually do have valid reasoning; if so, please share it

jokethrowaway
0 replies
21h51m

The rendering pipeline for DOTS is incomplete.

The studio had to implement it.

Sounds serious enough to try something else, especially given Unreal has Nanite. I think on the ECS side they're lagging behind though

wilde
0 replies
22h43m

It is very possible to make a game that runs like shit in Unreal. Immortals of Aveum comes to mind.

twodave
0 replies
22h47m

I think self-preservation probably plays a role here, too. If you think management will just fire you and hire someone who will claim they can build it in Unity, then maybe you decide this isn’t the hill you want to die on.

solardev
0 replies
22h45m

This is a sequel to Cities: Skylines 1, which was also written in Unity. It was probably easier to reuse code (and personnel) for the sequel without switching to a totally different engine?

Maybe for CS:3 :)

raytopia
0 replies
22h40m

UE5 is no magic bullet.

coffeebeqn
0 replies
22h32m

Sounds like the biggest problem was an insanely basic beginner mistake - I’m guessing they had inexperienced interns model most of the buildings, so the poly counts are absurd. And no one checked the poly counts before putting them in the game? I’m a little confused how they screwed this up after already shipping Cities 1, which didn’t have any major issues like this

WhereIsTheTruth
0 replies
22h22m

Unreal is not a panacea

The problem is the developers more than the engine choice

Most of the AAA released this year had performance problems, most of them were built using Unreal 5

candiddevmike
21 replies
23h1m

> Quite a few YouTubers and streamers got early access to the game, but they were explicitly forbidden to talk about performance until the regular review embargo was lifted.

This, along with games like Alan Wake 2 getting phenomenal reviews in the face of terrible performance and game breaking bugs, makes me wonder why folks trust the current corrupt review system, of which streamers are now a part too.

solardev
4 replies
22h52m

Alan Wake 2 looks a lot better, though. From the article:

> As a comparison similar hardware in Alan Wake 2 — which was released the same week as C:S2 and is considered by some to be the best looking game of this console generation — reaches comparable average framerates with all settings cranked including path tracing, either at 1440p without any upscaling magic or at 4K with some help from DLSS. I think that’s a good illustration of how bizarrely demanding C:S2 is.

I love CS:2, but it does run noticeably worse than every other game I have (using GFN's RTX 4080, which normally never lags) with outright stutters lasting a few seconds at a time, interrupting whatever I was doing. And it does that without looking really any better than its predecessor. And even if you turn down all the settings, it still lags quite a lot.

I think most people playing city builders don't necessarily demand super-next-gen graphics, but performance that can keep up with the growth of their cities.

I still gave CS:2 a positive/thumbs-up review, but I can understand why so many people are frustrated with it. Launching with such a limited number of building options and no editor or mod support was also kinda a let-down. Still, I'm excited about the foundation they've made in 2, and think that it'll be awesome in a few years' time.

candiddevmike
3 replies
22h49m

This is why Steam reviews are nice. It's fine to review something negatively out of the gate and review it more favorably once it's in better shape. This helps other people more than rating it positively.

solardev
2 replies
22h44m

Yeah, I love the review score histogram over time feature too, along with the "Overall reviews" vs "Recent reviews" summaries.

0cf8612b2e1e
1 replies
22h17m

I wish there was a way to discount the first month of reviews from the overall score. I assume that the people who buy, play, and review the game immediately are the super fans who have extreme views on what makes the game good.

solardev
0 replies
22h14m

There is! You can click and drag on the review date graph to any custom timeframe you want. Then it'll calculate the overall rating of just your selected date range for you.

I wish I could post an image here for you :( But basically just go down to the reviews, expand the graphs (with "Show Graphs"), and click and drag a box on the left hand one.

FireBeyond
4 replies
22h25m

> makes me wonder why folks trust the current corrupt review system, of which streamers are now part of too.

Agreed. There's such a whole world of unreported 'sponsored' or otherwise products, and streaming is a big part of it. And no-one is immune.

When the cheesegrater Mac Pro came out, I watched a lot of videos on it, particularly on YouTube in the photo/video segment - I was planning to get one, and while I had other uses for it, I'd be doing a lot of photo work on it in my recreational time.

Quickly I noticed just how many of the big name streamers had launch day or very early access to the Mac Pro and Pro Display. Sure.

And then I noticed how each and every one spun it as "I just got mine", "just bought one", and so forth. All organic, they'd have you believe - not a single one said "Apple sent me this". And yet...

By "a curious coincidence", every single one had seemingly ordered the exact same spec: an 18 core CPU, 384GB of memory, the Vega II Duo GPU, and 8TB SSD, and the nano-textured ProDisplay.

So what, you might think, that might have been the quickest shipping order. Also an $18,000+ computer, $25K with the display.

And if you're a photographer, even if you're working on medium format digital and 100MP images, you in no way shape or form need 384GB of memory, or that GPU. For me, Lightroom / Capture One and Photoshop all barely sweated on my 12 core 192GB W5700X variant.

So then Occam's Razor applies. What are the odds that, even of just the 8-10 streamers I watch, they all got exactly the same spec Mac? Or is it that that was the spec Apple was sending to high popularity streamers?

Except not a single one even implied that that might have been the case. And I don't doubt that many or all bought their own at some point. But I suspect it was mostly "got one from Apple, talked it up, and then substituted it with my own when it arrived".

zf00002
1 replies
22h2m

I've been getting into sim racing lately, and a majority of the YouTube reviewers do this thing where they claim it's not a sponsored video but the company sent them the product for free. And that's the same problem as when it was magazines publishing glowing reviews of mediocre products just so the gravy train of free stuff wouldn't end.

FireBeyond
0 replies
21h55m

Those would be somewhat better if they made a point (I've seen some in the photography world do it with smaller things) of "and after I go through a review, I'm going to give it away to a viewer/follower..."

throw3823423
0 replies
21h34m

And it gets worse the smaller the market is: there is a chance that a youtuber with a sufficiently large following could actually choose to buy said Mac Pro, because their revenue might be pretty large. But then you look at, say, board game reviews. Nobody, ever, buys a game. But the number of views isn't good enough to dedicate the time to it as anything other than a hobby. Thus, anyone posting enough that they make it their job is also getting sponsored on top of the free product, but nobody wants to tell you that. Thus, all you are seeing is 100% ad, just shaped as a review, or as entertainment.

DonHopkins
0 replies
19h5m

Hey, it worked for Jerry Pournelle and Robert Scoble ...

make3
2 replies
22h15m

it should be illegal imho. You straight up shouldn't be allowed to tell reviewers what they can and can't review

perihelions
1 replies
20h53m

Strong agree. It should be core FTC rulemaking to prohibit this; it's a form of advertisement masquerading as independent review.

(If the subject of the review sets conditions on its content, to restrain the reviewer from discussing their negative observations, it has the character of an advertisement. The subject is enforcing editorial control; thus, they have partial authorship/editorship of their own "review". Either you sign your name to your *ad*, or, the reviewer signs their name to their unburdened conscience—there is no in-between).

jsnell
0 replies
18h57m

What's being described isn't an example of conditions being set on the content of the review. It's an embargo date on when a review may be released, and a restriction of what may be publicly disclosed before the review is published.

There's nothing wrong about that, and in fact forbidding embargo dates would have a pretty bad outcome. It would result in reviewers getting no early review copies, and having to rush out shoddy reviews ASAP after the release. Likewise the customers would have no access to reviews on the release date, and would either need to wait or buy the game blindly.

purpleflame1257
1 replies
22h41m

Gamergate, to a certain extent, was indeed about ethics in game journalism. It just got hijacked into culture war bullshit like everything else.

But who cares? As long as the sheep keep preordering the music won't stop.

DonHopkins
0 replies
16h23m

GamerGate was pure unadulterated juvenile misogynistic right wing culture war bullshit from the start, and had absolutely nothing to do with ethics or game journalism, and everything to do with intolerance and harassment and misogyny and abuse, and you know it, so stop trolling.

wincy
0 replies
21h6m

I’m running Alan Wake 2 on an RTX 3080 with a 7800X3D and the first few sections in the forest (don’t think describing forest scenes in Alan Wake 2 is a spoiler here) ran around 30fps, and then the absolutely sumptuous next few levels (man the maps are gorgeous) have been running at 60fps rendered at 1080p upscaled to 1440p.

It didn’t seem like turning down settings caused a huge increase in frame rate so I just put everything to max and man it looks great. So you at least feel like there’s a reason your graphics card is running at 82°C and your fans are spinning up.

On the other hand, my wife was a huge Cities Skylines 1 player and bounced off the game pretty hard, as the low frame rates doesn’t come with any sort of real upside.

squidsoup
0 replies
22h50m

Alan Wake 2 performs well on PS5, and I haven’t encountered any “game breaking bugs” (about 3/4 of the way through). There’s no conspiracy, it’s a fantastic game.

madeofpalk
0 replies
21h57m

The internet is now full of non-journalists ("content creators") producing what people substitute for journalism.

gorbachev
0 replies
21h14m

It's always been like this. Long before streamers got involved. Video game magazines (the paper ones...remember those?) had the same exact process for reviews. Game publishers expected certain things in return for access to future game releases and most video game magazines cooperated.

Every two years or so there was a scandal about either reviewers calling out a game publisher about the bullshit, or a widespread backlash against obviously lacking reviews.

doikor
0 replies
21h47m

> Alan Wake 2 getting phenomenal reviews in the face of terrible performance and game breaking bugs

There is no terrible performance and the game breaking bugs are very rare (as in none of my ~10 friends who bought the game ran into any of them on their first playthrough). Yes it is a demanding game but it is also one of the best looking games ever made.

Yes the game is demanding if you max out everything but as it runs at 30/60 fps in quality/performance modes on PS5/Xbox Series consoles it does run at those (or better) framerates on PCs very easily.

The only case where you get bad performance is when you use incompatible hardware (old graphics cards without mesh shader support)

TazeTSchnitzel
0 replies
22h50m

Some reviewers stand up to this kind of stuff (sometimes at great cost). You can choose to follow and support those reviewers. But the general rule is to simply never pre-order stuff.

alkonaut
21 replies
19h4m

Making a AAA game sounds horrible, and making one in Unity sounds doubly so. All of these things sound like fixable issues, and they'll likely be fixed. Hopefully the developers made these oversights because they focused on what makes the game worth fixing to begin with: that the gameplay is fun. In many recent games I feel developers have focused on the wrong things and totally forgot the core, meaning that even if they fix the bugs, the game is still quite hollow (cough, Starfield). A game like this should of course have a continuous perf process, and if it doesn't run ok (min 30fps on median hardware, 60 on enthusiast hardware, for example) then it just shouldn't ship. I wish more studios would stop having crunch time for meaningless deadlines such as holiday seasons. Someone has said "it's ok to be just 10fps on beefy hardware, we can fix that later, let's ship it now".

Dah00n
13 replies
18h39m

>Someone has said "it's ok to be just 10fps on beefy hardware, we can fix that later, let's ship it now".

Well, yes, because it doesn't matter. It is on the top seller list on Steam. I agree with you, but we can discuss fixes till our fingers bleed. In the end, the problem is capitalism.

https://store.steampowered.com/search/?supportedlang=english...

christophilus
4 replies
18h2m

> In the end, the problem is capitalism.

Is this sarcasm? I’m asking seriously. If not, then how is a poorly running game the result of capitalism, and what is the alternative economic model that would produce only high-performance / efficient games?

johnnyanmac
3 replies
12h27m

I don't think it's sarcastic. The game runs poorly because it needed more time for optimizations. It doesn't get more time for optimizations because the publisher said it needed to ship now. The publisher can say that it needs to ship because they can advertise for good launch sales, because a large portion of the customer base will buy it at launch as long as there aren't obvious showstoppers.

It's a bit trite to sum that down to "capitalism", but sure. it's the underlying societal buzzword issue

gruez
2 replies
6h26m

Corner cutting doesn't happen in socialist countries?

johnnyanmac
1 replies
6h23m

In a truly socialist structure, corner cutting comes from prioritizing other duties to the people, directly or indirectly. Capitalistic pressures come from money.

But there is no pure capitalistic structure nor socialist one. I did already mention it was a trite comparison.

gruez
0 replies
5h38m

>In a truly socialist structure, corner cutting comes from prioritizing other duties to the people, directly or indirectly. Capitalistic pressures come from money.

You can argue that's the same under capitalism, just replace "the people" with "customers". In both cases corner cutting is only an issue if there's deception involved, whether that's a Party member using lower quality materials for a construction project, or a video game company spending less time polishing a game. I don't see a problem with releasing unfinished/poorly optimized games as long as it's clearly disclosed. Early releases are basically this.

mcmoor
3 replies
17h34m

I've learnt that (initial) success of a sequel is 95% because of its prequel. True performance may be seen if it has another sequel, or DLC, or heck, another 3 months.

johnnyanmac
2 replies
12h30m

That's the benefit of continually updated games. By the time CS3 is ready, people won't remember CS2 as it shipped in 2023, but whatever 3-6 years of updates turns CS2 into. For a modern example: who's still complaining about Cyberpunk in 2023?

It's not like Sonic 2006 that is forever broken, sold decently at launch and then cratered the series for the next decade to come.

mcmoor
0 replies
12h5m

There's still a risk of the game cratering completely like Imperator, although that wasn't a sequel to an established title. Sometimes I'm still worried how Victoria 3's fate would be.

alkonaut
0 replies
8h28m

> For a modern example: who's still complaining about Cyberpunk in 2023?

Those who bought it and played it in 1.0! Many of them never saw the fixed game. They weren’t helped at all by the fact that the game was quite good 6 months later. Worst case they’ll never buy a CDPR game again, best case they’ll buy the games after 6 months (which by the logic of rushing releases is a disaster, because apparently not selling millions the first holiday season is a failure).

drstewart
1 replies
17h39m

The problem is European capitalism, as the game developer here is Finnish and publisher is Swedish. Important clarification since you seem very invested in calling out the US in every other post of yours, so I'm sure you appreciate the calling out of systemic European failures here.

heywhatupboys
0 replies
9h57m

The Soviet Union produced some of the best video games!!

heywhatupboys
0 replies
9h57m

> In the end, the problem is capitalism.

surely, it is the underlying economic system ruining the art of modern video game development!

alkonaut
0 replies
17h31m

That’s still not a sufficient condition for success. Longer term the studio brand will be hurt, not least.

brucethemoose2
6 replies
17h10m

> then it just shouldn't ship

Part of the problem is that AAA is just (IMO) too big and expensive. Devs might actually have to ship a broken game around holiday time just to get enough sales to survive.

And the other extreme can be dangerous too, like how Mass Effect Andromeda's development dragged on forever, and EA let it happen because its such a golden IP.

I think the ultimate solution is to just scale down most studios a little bit, so the studio and publisher can afford to delay. Medium sized studios are the sweetspot, especially going forward with GenAI.

chii
2 replies
16h3m

If a studio downsizes, they might be able to survive making lower budget games. And while it's true that if every studio does this, the industry would get better, there's incentive for _one_ studio to pump up their funding and make a super-high-fidelity game, and grab all marketshare.

This makes it a 'race to the bottom' style (or a race to the top?) competition, where higher funding gets you more marketshare, but only against lower funded studios. It's akin to advertising budgets. Mostly a zero sum game in the end.

brucethemoose2
0 replies
15h49m

> super-high-fidelity game, and grab all marketshare

This is not necessarily true any more. I think smaller studios chipping away for a long time are making better games than most big ones.

BeFlatXIII
0 replies
3h13m

Hopefully this dynamic creates a cascading market crash to wipe out all the biggest players and force the industry back to its proper size.

3seashells
1 replies
9h52m

EA was the culprit for those delays by forcing BioWare to use Frostbite. Blame rests solidly with some synergy-savings suit.

brucethemoose2
0 replies
1h45m

Going by the allegations in the reports, it wasn't just Frostbite. BioWare chose to dump a lot of time into features like procedural generation (and having to use Frostbite exacerbated this problem even more).

SXX
0 replies
9h20m

It's not like this is unique to AAA. Unless gamedev is your hobby and not a full-time job, you simply have no choice other than to ship the game and try to fix it.

Any small indie studio has to deal with it.

inoffensivename
17 replies
22h51m

I spent 40 minutes trying to eke out more than a handful of fps on an empty map with the resolution set at 1080p with Proton Experimental. I gave up and got a refund, I'll try again if they fix the awful performance.

I got a tremendous amount of enjoyment out of the first instalment of the game, it's a big bummer that I can't give this one a go

solardev
13 replies
22h48m

It's pretty playable on GeForce Now, for what it's worth. Still a bit laggy, but I was able to play for many hours without major issues... just the occasional annoying but livable stutter.

Unfrozen0688
8 replies
19h58m

Doesn't really count as it is not rendering on your machine... of course it's good there.

solardev
7 replies
19h45m

So? That's even better. Doesn't use my battery life or create noise & heat. Netflix isn't run on my machine either.

smolder
3 replies
19h14m

It just uses natural resources to outfit and power data center stuff to create heat and noise somewhere further away. Netflix... is fairly efficient, though being on-demand, perhaps much less so than broadcast TV.

solardev
2 replies
16h40m

Yes, but better that, in a shared environment with center-wide cooling and such, than each individual household needing to do that on their own.

Also way fewer cards needed this way, with users being able to share cards through the day instead of each needing their own.

Basically mainframes all over again :)

Chaosvex
1 replies
10h34m

> Yes, but better that, in a shared environment with center-wide cooling and such, than each individual household needing to do that on their own.

I don't think this actually tracks, unless the heat is actually being put to use. You don't need HVACs when you have the machines distributed.

solardev
0 replies
4h35m

Why don't you need distributed HVAC? A gaming desktop can use a kilowatt of power, which is what a small space heater might put out. It often will make the room uncomfortably hot. (Anecdotally, this is part of why I switched to Geforce Now, myself. My apartment got incredibly uncomfortable from the heat. It's an older unit with no air conditioning.)

In data centers, sometimes (not always, and perhaps not often?) the heat can be more efficiently handled through central heat exchangers, more efficient commercial HVACs, etc.

Unfrozen0688
2 replies
19h44m

Sure, but then it does not have any relevance to the article.

solardev
0 replies
16h42m

Wasn't supposed to challenge anything in the article. Just an option for those who are struggling to play it on their current hardware.

Even with GFN it's laggy and stutters. Totally agree with the article.

jodrellblank
0 replies
16h40m

It wasn't claimed to be relevant to the article; it was a reply to "it's a big bummer that I can't give this one a go" with a suggestion of how they could play it.

JCharante
3 replies
21h31m

GeForce Now has been amazing as a mac only user

Unfrozen0688
1 replies
19h58m

Yes because you have no other options.

solardev
0 replies
19h46m

There are lots of options: GPT, WINE Crossover, Luna, Boosteroid, Shadow.tech... none of them run as well as GeForce Now, though. Or a dedicated gaming PC.

solardev
0 replies
21h14m

Same.

I have a M2 Max and GFN is much much easier than trying to set something up with GPT (Game Porting Toolkit) and Whisky, and much faster & quieter too. An RTX 4080 running in their data center means no local heat and noise.

DonHopkins
1 replies
19h35m

Try not fluoridating the water, defunding the dentistry college, and subsidizing sugar, so everyone's teeth fall out. Runs much faster then!

[Chill: it's a tooth joke, not a conspiracy theory.]

dieortin
0 replies
18h0m

Water is not fluoridated in most of the world, that’s one of many weird things in the US

deanCommie
0 replies
18h28m

The fix fits into a tweet -> https://twitter.com/ColossalOrder/status/1716883884724322795

> If you're having issues with performance, we recommend you reduce screen resolution to 1080p, disable Depth of Field and Volumetrics, and reduce Global Illumination while we work on solving the issues affecting performance.

This is all I had to do to get smooth performance on an AMD Radeon RX 5700 XT

Havoc
13 replies
22h26m

The odd thing is I doubt a single Skylines fan thought “what this really needs is graphics so intensive I can’t build a big city”

It’s remarkably tone deaf

capableweb
9 replies
22h24m

I also doubt any of the developers were aiming for that either.

I've been looking through the decompiled code for the purposes of modding for the last few days, wrapping my head around their ECS/DOTS code, and the game has a much better foundation for a scalable simulation than CS1 ever had, even after years of optimizations.
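For readers unfamiliar with the pattern: the core idea of ECS/DOTS-style data-oriented design is that components live in flat, contiguous arrays and "systems" are tight loops over them, which is what lets a simulation scale to huge agent counts. Here is a minimal illustrative sketch of that layout (invented names, TypeScript for readability; this is not Unity's actual DOTS API or the game's decompiled code):

```typescript
// Minimal data-oriented (ECS-style) sketch: components stored as flat,
// contiguous arrays indexed by entity ID, with a "system" iterating
// them in one tight, cache-friendly loop. Illustrative only.
interface World {
  count: number;      // number of live entities
  posX: Float32Array; // position component, split per axis
  posY: Float32Array;
  velX: Float32Array; // velocity component
  velY: Float32Array;
}

function createWorld(count: number): World {
  return {
    count,
    posX: new Float32Array(count),
    posY: new Float32Array(count),
    velX: new Float32Array(count),
    velY: new Float32Array(count),
  };
}

// A "system": pure logic over component arrays, no per-entity objects.
function movementSystem(w: World, dt: number): void {
  for (let i = 0; i < w.count; i++) {
    w.posX[i] += w.velX[i] * dt;
    w.posY[i] += w.velY[i] * dt;
  }
}
```

The point of the layout is that `movementSystem` touches memory sequentially, so the CPU caches and prefetcher do most of the work; the object-per-entity alternative scatters the same data across the heap.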

lamontcg
4 replies
21h59m

What probably happened is that certain dev teams built ridiculous things on top of the ECS/DOTS code to see how much they could push the detail. So you wound up with things like the teeth, which was probably some internal demo-driven development that everyone 'wow'd at. Times 100x or something like that. Nobody owned overall performance, or whoever did had no management support before launch because it was deprioritized. Nobody gets promoted for ripping out someone else's perf-expensive features and being the bad guy; you get promoted for building new clever features on top of the shiny new engine.

paavohtl
3 replies
21h56m

> So you wound up with things like the teeth.

Author here. The teeth are completely unrelated to the simulation and not even really related to pushing maximum graphical fidelity. They are just using completely unoptimized - possibly stock - character models, which include a mouth with teeth even though the characters never open their mouths.

reactordev
1 replies
21h50m

But they are also using LOD... as per your article. So it's quite possible (you have the source, go look) that they call this lib [0] with the vertex buffer of the mesh in question and get back an LOD3 mesh that's decimated and simplified but with the projected UVs. This is kinda how Unity wants folks to do "AAA" in Unity: use a high resolution model and decimate LODs in real-time like Unreal Engine does. Only, it doesn't work half the time. It requires that the original high poly mesh be properly UV wrapped and not vertex shaded. So, to further throw gas on the fire: it's entirely possible InstaLOD is the source of a lot of these issues, combined with what you found in the rendering stages.

[0] https://instalod.com/

paavohtl
0 replies
19h42m

InstaLOD is only used in the asset pipeline. I haven't seen any references to real-time use. Haven't found any evidence of real-time decimation either.

To be clear there are LODs, but only for some of the meshes.
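For context, the technique the article says is missing for many meshes (distance-based LOD selection) is conceptually simple. A hedged sketch with made-up meshes and triangle counts, not the game's actual code:

```typescript
// Illustrative sketch of classic distance-based LOD selection.
// The mesh data and thresholds are invented for the example.
interface LodLevel {
  maxDistance: number; // use this level while the camera is closer than this
  triangles: number;   // cost of rendering the mesh at this level
}

// A hypothetical building with four hand-authored LOD levels.
const buildingLods: LodLevel[] = [
  { maxDistance: 50, triangles: 20000 },   // LOD0: full detail, close up
  { maxDistance: 150, triangles: 5000 },   // LOD1
  { maxDistance: 400, triangles: 800 },    // LOD2
  { maxDistance: Infinity, triangles: 50 } // LOD3: distant impostor
];

// Pick the cheapest level whose distance threshold covers the camera.
function selectLod(lods: LodLevel[], cameraDistance: number): LodLevel {
  for (const lod of lods) {
    if (cameraDistance < lod.maxDistance) return lod;
  }
  return lods[lods.length - 1];
}

// Without LODs every instance pays full price; with them, only nearby ones do.
function trianglesRendered(distances: number[], lods: LodLevel[]): number {
  return distances.reduce((sum, d) => sum + selectLod(lods, d).triangles, 0);
}
```

With, say, a thousand building instances mostly far from the camera, the LOD path renders tens of thousands of triangles instead of tens of millions; skipping it is how you end up paying full price for teeth nobody will ever see.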

lamontcg
0 replies
21h45m

The teeth certainly aren't helping anything, and something has to make a decision not to render them somewhere, and they're wasting disk and RAM if nothing else.

Also think that's a distraction from my point, so just pretend I wrote "100,000 vertex logpiles" instead of teeth.

The teeth are just the most obviously useless unoptimized thing they shipped that nobody had the time/will to clean up before launch.

MagicMoonlight
1 replies
22h6m

I bet whoever worked on the actual simulation worked really hard for years to make it run efficiently.

Then the meme developers in the front end went “lmao JS and CSS for the interface, 100k for a log” and ruined it all

capableweb
0 replies
21h42m

Not sure how the "frontend" (UI) developers have anything to do with the assets, but it's an easy scapegoat I guess...

Havoc
1 replies
21h36m

That’s cool. Didn’t realise modders still do that sort of low level stuff

capableweb
0 replies
21h31m

Considering Paradox/CO are dragging their feet to release the modding docs/tools, we don't have much choice if we wanna start building our mods today :)

huytersd
1 replies
21h53m

No, we can have both things. Graphics are incredibly important to a game, especially for a simulator like this; it really helps with the immersion. It’s as important as gameplay.

Havoc
0 replies
21h37m

> It’s as important as gameplay.

Important certainly, but colour me unconvinced on equally important. If the gameplay isn’t fun on a simulator then it isn’t much of a game.

People still play chess despite terrible graphics

lawn
0 replies
22h18m

I wouldn't even care if the graphics were exactly the same as in the first game.

It's all about the gameplay.

EMM_386
12 replies
22h23m

> A brief glance at the JS bundle reveals that they are using React and bundling using Webpack. While this is something that is guaranteed to make the average native development purist yell at clouds and complain that the darned kids should get off their lawn,

When I first heard of this as being a thing, my initial reaction was indeed something like "wait what? HTML and CSS in a desktop PC game driving the UI? No, that shouldn't be ... ".

But then I used Microsoft Flight Simulator 2020, was amazed by the graphics and performance ... and learned about how the complex and detailed cockpit gauges are using WASM/HTML/JS for rendering. No React, but still the "web technologies".

It dawned on me that this is apparently not exactly the weird, strange, bad performing, wrong-use-case, "why would you ever" thing I originally saw it as. Because it was working fine in that complex scenario.

lyu07282
3 replies
21h45m

I imagine lots of backend developers never really thought about what it actually takes to build complex, responsive, performant and visually appealing UIs and how the web platform just so happened to have matured for decades to make all of that as painless as possible. Last time I saw that reaction was someone learning about how SpaceX is using it for their flight controls in the Dragon capsule.

mattlondon
2 replies
19h51m

> I imagine lots of backend developers never really thought

There is a lot of snobbery and arrogance from "pure" backend developers who think JavaScript is some limited toy language. If it is not C++ or Rust or even Go (...assuming they are ok with the GC) then to them it is not worth wasting a millisecond's time on while they go off and fetishise over their copy semantics for their CRUD website backend.

Modern JavaScript is highly performant and the DX is second-to-none (unless you are using anything related to NPM). JavaScript and Typescript freed from the NPM nonsense is an absolute joy to work with.

I look forward to the inevitable dominance of JavaScript on the backend.

polishdude20
1 replies
19h11m

So... Node?

mattlondon
0 replies
4h41m

Deno, Bun, or something else that hasn't been created yet, I would hope. NPM has shat the bed for Node et al, IMHO.

davedx
1 replies
21h45m

All sorts of industries use web tech to render UIs. Even SpaceX used it in Dragon. People just have preconceptions and biases, and gamers are horrendously intolerant of everything and anything that doesn’t fit their world view of “how things should be”.

EMM_386
0 replies
20h29m

> All sorts of industries use web tech to render UIs. Even SpaceX used it in Dragon.

When I realized this was performant enough to drive a 747's entire cockpit while the simulator moved along at a high FPS and with incredible visuals, and was now a thing that was being selected for "state of the art" AAA games ... from there I did my homework.

So that led to reading all about the SpaceX Dragon UI, as you mention, and all the myriad other places this is used.

I did feel like I was too far outside the loop, having been a software engineer working with these technologies for quite a while by that point; "quite a while" being the pre-JavaScript years.

I just wasn't working on projects where such a thing would meet a requirement.

Now? I still have no use for it but I find it incredible that everything in this entire pipeline has become so optimized and reliable. Apparently to the point you can suggest running a spacecraft's user interface and not leave everyone staring at you, blank-faced and not sure how to respond.

Instead, possible responses can now include "sounds good".

starkparker
0 replies
19h48m

Electron's bad performance really poisoned that well early on.

Gameface's architecture is described in its documentation: https://docs.coherent-labs.com/unity-gameface/integration/te...

Most of its divergence from browser implementations are unsurprisingly in font rendering, which is generally a headache in games anyway: https://docs.coherent-labs.com/unity-gameface/integration/op...

And in RTL text rendering: https://docs.coherent-labs.com/unity-gameface/integration/op...

The Javascript DOM API is a subset, but pretty rich considering: https://docs.coherent-labs.com/unity-gameface/api_reference/...

solardev
0 replies
20h23m

Just wait for Boeing to start making real cockpits in React...

rtpg
0 replies
16h44m

People like to whine about web stuff being for babies, but writing dynamic UIs in "normal" things is very messy.

Now, there is stuff like imgui, of course. But try writing a very heterogeneous UI in something like Qt and then come back to React. There's very little empathy for the machine, but it works so well.

polishdude20
0 replies
19h12m

Do MSFS menus use html as well? Because they are 100% the worst most laggy menus I've ever used.

nercury
0 replies
5h26m

I would not sing praises for Microsoft Flight Simulator 2020. First, the cockpit displays are rendered at lower framerate, because they take considerable chunk of frame time. Second... It may be fine to render display and UI using js, but if you dig in, you will also find the autopilot there on top of inheritance hell.

JCharante
0 replies
21h39m

Battlefield 1 was using React to render the UI in 2016!

Conference talk by dev: https://www.youtube.com/watch?v=Pkf9H3XEMoE

torginus
11 replies
11h1m

Unity is a clown engine. I remember some guy benchmarked DOTS, against plain old C++ code with OpenGL, just like your dad used to write, and it was no contest, DOTS couldn't keep up.

You have to remember, Unity usually sets the bar very low for themselves, comparing against their ancient Mono implementation (they still use the Boehm GC, written in the 1980s!), and when a shiny new performance 'fix' like Burst/DOTS drops, they proudly proclaim how much faster they managed to make things; never mind that a modern .NET implementation, like Microsoft's CoreCLR, would achieve the same performance without any proprietary tech and without any code modification.

Varriount
8 replies
10h44m

Huh, I never knew Unity used its own CLR implementation. Any idea why?

heywhatupboys
4 replies
10h0m

Unity is older than the Core runtime. They had to use Mono as their base.

fulafel
3 replies
9h11m

Assuming they had C# or .NET as a constraint. I'd wager not, they just wanted something that would work for "scripting".

geon
2 replies
6h47m

C# and their own custom CLR languages were the only supported ones, so yes.

fulafel
1 replies
6h39m

There were lots of other languages with support available.

I don't think Unity had anyone mandating they had to use .NET or C#.

speps
0 replies
5h55m

It used to be C# along with JS and Boo (a Python-like language for .NET). All docs had examples in order of preference for JS, C#, Boo.

torginus
2 replies
9h37m

I think its implementation is a fork of a 2006-era Mono (a clean room reverse engineered .NET implementation). After a few releases, they had a falling out with the Mono team over licensing, meaning they got stuck on an old release.

Besides that, they made a bunch of proprietary changes to Mono to make it run on their engine, and to be able to export to literally any platform under the sun.

A lot of platforms, like iOS and consoles, have (or had) a strict no-JIT policy, so they needed to come up with a way of statically compiling code for those platforms. One of the methods they used was IL2CPP, which turned .NET bytecode into horrible-looking C++, full of gotos, weird labels, and structs getting passed around.

Considering some platforms had limitations like you had to compile the game solely with the C++ compiler the platform supplied, not sure if they had a better solution, but it's still horribly hacky.

They've been manually syncing up changes from the more recent versions, but I can't really tell, at what pace.

But the thing is, even the official Mono has never really kept pace with MS's implementation, and recently, after the acquisition, Mono was dropped by MS in favor of the CoreCLR.

nvm0n2
0 replies
2h10m

Which platforms require you use a single C++ compiler?

neonsunset
0 replies
8h21m

At the moment, Unity is in the middle of the move to "vanilla" flavour of .NET aka CLR. Once they do, it should dramatically improve GC performance in Unity and enable an easier route for targeting AOT-requiring platforms like iOS (it is officially supported as a fully-fledged target for NAOT starting with .NET 8, and there's even rich interop directly with Swift in the plans).

maeln
1 replies
6h29m

> Unity is a clown engine. I remember some guy benchmarked DOTS, against plain old C++ code with OpenGL, just like your dad used to write, and it was no contest, DOTS couldn't keep up.

I don't think it is necessarily a fair or useful comparison. I also wrote several toy 3D engines a few years back, and some were performing better than some "commercial" engines, but that is not hard to do when you don't have to handle cross-platform support and a large number of features. A fairer criticism is that Unity has been unable for the last decade to actually stick to a few features and make them mature and production-ready. DOTS was introduced like 4-5 years ago? And it still feels like a polished tech demo, not a production-ready system properly integrated with the rest of the stack. And the same could be said about so many Unity features. The "legacy" systems are just plain more reliable than almost any new feature started as their replacement.

torginus
0 replies
5h44m

I don't think we disagree on much. Unity has failed to stick the landing on most features developed in the past decade. If you make small to medium sized games and treat it like 2015-era Unity, it still works decently, but progress since then has been slow going. All the new-ish features since then, like Scriptable Render Pipelines, networking, and DOTS, had a rough development path, with constant bugs, breakages, and stillborn features; sometimes packages got deprecated before reaching production status.

As for the C++/OpenGL comparison, I could've just recommended any common sense approach using industry standard tools, like Unreal.

I'm just saying, in terms of code performance, DOTS promised a paradigm shift: a cool new way of structuring code in a dialect of C# that would reach heretofore unseen heights of performance. The reality was that Unity Burst code is barely faster than CoreCLR .NET code, and is slower than straightforward no-frills C++ (but at least you get Unity vendor lock-in). This performance matters when you're trying to code a core game system, like procedural shattering, CSG, or inverse kinematics; doing it in regular C# will bite you in the butt. I'd rather do C++ for the few things where performance matters (not to mention, I'm 99% sure that well-tested, high-performance open-source C++ implementations of the above problems exist) than let Unity take me for a ride with DOTS.

Just take a look at Unreal. I'm pretty sure most of the stuff Unity's still working on wasn't a problem circa UE4 release@2014, and most of their 2014 demos would work in a modern version of Unreal (not sure how much of a breaking change UE5 is, but definitely in the latter UE4 releases).

simonebrunozzi
11 replies
22h54m

I don't play videogames, but I am familiar with famous titles like this one.

I would have loved to see something really bold, such as Cities set in Venice, and the task is to revamp the city. More interesting, instead of cars, cars, cars...

solardev
3 replies
22h50m

You might be more interested in the Anno series then, of which 1800 is the most recent release: https://en.wikipedia.org/wiki/Anno_1800

It's set in the early industrial era, so a lot of sails and rails and small colonies exporting things all over. I can't remember if there are cars yet, but if there are, it's not a major focus. Some of the levels have you revamping existing towns, while others let you start from scratch.

(Edit: Anno 1800 is actually free to play on Steam this weekend: https://store.steampowered.com/app/916440/Anno_1800/)

An older title, Anno 1404, had a Venice expansion too: https://en.wikipedia.org/wiki/Anno_1404#Expansion

All of them are playable on Ubisoft+ for $15/mo (I think?), so it's a pretty low-risk investment if you want to try them. If you don't have the hardware, you can also stream them on GeForce Now and, I think, Amazon Luna.

Games are super accessible these days!

notahacker
2 replies
22h39m

I'd love to see something with the flexibility and sandbox nature of Skylines and the historic themes of the Anno series (which has pretty graphics but is ultimately tied to a grid and more about sticking buildings on the right tiles to maximise efficiency than worldbuilding)

solardev
0 replies
22h1m

PS: 1800 is free this weekend, if you wanna try it https://store.steampowered.com/app/916440/Anno_1800/

solardev
0 replies
22h31m

Have you tried 1800? The grid there is a pretty "soft" one, meaning it's more like a visual guide than anything enforcing gameplay. It's not that different from zoning and road-building in Skylines, where your beneficial buildings (like hospitals and police stations and schools) also have a limited radius of effectiveness.

Gameplay vid with no commentary, if you wanna see: https://youtu.be/jxm_ZroHj3E?si=qDnn1AK1d2kdVAf-&t=68

The campaign might be linear, but the sandbox mode feels like a mix of Cities and Civilization to me. It kinda scratches that sandbox itch, though the focus on trade (vs city-building) got a bit tiresome for me.

I think CS:2 also copied some of the mechanics from 1800 (like the customizable placement of farms around production buildings), trade buildings for lumber, ore, etc.

mrkeen
3 replies
22h48m

I heard about city simulator games modelling the number of carparks they'd need before deciding to scrap them instead.

If they left them in, it would have been a fun challenge to minimise them.

https://www.theverge.com/2013/5/9/4316222/simcity-lead-desig...

solardev
2 replies
22h29m

Parking and traffic is actually a big deal in Cities: Skylines too, both 1 and especially 2. Traffic will make a lot of areas harder to get to.

The second game adds parking lots of various sizes, parking structures both underground and above ground, etc. But it also encourages you to build alternative public transportation like buses, trolleys, trains, metros, etc. (Sadly no bicycles in the 2nd game yet).

chc
1 replies
22h1m

Cities: Skylines was sort of a follow-up to Cities In Motion, which was entirely about transportation, so that's a very essential part of its DNA.

Tijdreiziger
0 replies
21h2m

As someone who played a lot of CiM, CiM2 and C:S: they’re not as similar as you might expect.

The CiM games were all about public transit, cars played a supporting role at best. IMHO, they’re the best transit games ever made.

In C:S, it’s the other way around: managing car traffic is imperative (to the point that many recommend micromanaging intersections with mods such as TM:PE), and transit takes the supporting role (although it’s still a lot of fun).

GaggiX
1 replies
22h22m

No idea why you were downvoted; there are big limitations with this game: the square-based zoning favors grids, there's a lack of bike infrastructure, and no progress has been made on this. So yes, cars and grids, unless you want dead areas.

squeaky-clean
0 replies
11h15m

Because the game they're asking for isn't a city builder, it's a city fixer. It's like the difference between playing the FIFA series and the Football Manager series.

Your comment ignores the primary point of their comment, they want to start with an existing city and fix it.

candiddevmike
0 replies
22h50m

There's a game kind of like that, maybe, called Cities in Motion (also published by Paradox...) where you build mass transit for existing cities. Buses, subways, trams, etc. Its narrow focus is pretty fun; while there isn't a Venice map, it is about revamping existing cities, kinda.

MagicMoonlight
11 replies
22h8m

100,000 vertices for a pile of logs and 10,000 for teeth inside a character's head is hilarious.

It blows me away how bad everyone is at their jobs. Imagine spending all day working on something and then you just make it garbage.

4gotunameagain
7 replies
21h54m

Now that's a bit harsh. You don't know what is going on inside that company, and under what environment the development happened.

This all could easily stem from a couple of key people leaving and chaos breaking loose, or from extreme time pressure by the publishers.

MichaelZuo
6 replies
21h38m

The default assumption is that the studio is not particularly different from the norm of the industry, unless proven to be otherwise.

clnq
5 replies
21h25m

Colossal Order has about 40 employees, even though they are AAA. Here - proven otherwise.

MichaelZuo
4 replies
16h56m

> even though they are AAA

Says who?

clnq
3 replies
13h50m

They have a large publisher (Paradox Interactive) who prescribes their schedules. They also made big bookings from Cities: Skylines that put them quite evidently in AAA. Finally, they release the games for all major platforms. All together, this describes a AAA game developer.

In contrast, an AA developer would work with publishers like Annapurna Interactive, Devolver Digital or Team 17, while a III game would have a much smaller scope - like The Witness, The Stanley Parable, or Hellblade: Senua’s Sacrifice. It is clear that Colossal Order doesn’t work with a small publisher or make games of indie scope.

If you were just sealioning, don’t do that please.

fulafel
1 replies
8h42m
clnq
0 replies
4h0m

That’s a very good resource, nice find! I think we can agree CO is between AA and AAA. I think it leans much more on the AAA side from my industry experience, but arguments could be made that it’s not all the way there and that therefore it’s AA. But this is a bit semantic. They make AAA money and have AAA standards and marketing. And that’s a bit unusual for 40 people, of which 10 are probably admin and 30 are dev.

MichaelZuo
0 replies
8h48m

Those don't seem to be strong reasons for categorizing Colossal Order as a AAA studio?

And what is 'sealioning'?

praptak
0 replies
21h38m

My humble experience is that this scale of duckup is unachievable from line workers being bad at their jobs.

It is easily achievable from bad/misaligned incentives, poor leadership, no product vision, and probably a dozen other organisational problems which leave decent workers working on stuff that actively makes the end product worse. Think Boeing 737 MAX.

misnome
0 replies
21h38m

To the contrary: I’m sure all the developers worked very hard, and very competently.

It sounds like classic mismanagement. Some artists making this were probably told that there would be some automatic culling or LOD system, so they could go wild - it wouldn't affect the end result; and then the system wasn't ready, or was cut by another part of the organisation without the artists ever knowing about it.

I’m sure there were vocal developers who understood the problems and advocated for fixing - but a decision was made to release anyway; I can’t even say wrongly, because games being half-finished on release and polished later is not at all unusual nowadays even for flagship titles; and they do have a track record of supporting their titles for a long time.

I can well imagine a reasonable decision to get money coming in now for the cost of a couple of months of low level complaints that nobody will remember in a year.

It sucks, but I am willing to bet it’s not laziness or people being bad at their jobs.

Geee
0 replies
19h48m

Lmao. This is often how everything seems from the outside, but the inside story is something else. Probably just too much work, not enough time.

theNJR
10 replies
21h51m

While the complaints about performance are valid, I’m still having a blast with the game. Having an 11 month old means it’s hard to go deep into something like Cyberpunk since I only get short gaming bursts (ie 7:30 when she goes to bed until 8:30 when I turn off screens). Not enough to play a deep narrative game but plenty of time to expand out my industrial area, fix a highway interchange and figure out what’s going on with the tram line.

bigstrat2003
4 replies
21h19m

The complaints are valid to some extent, but also overblown too. People complaining that 30-50 FPS is unplayable need to get some perspective on what is and is not playable. And even the article here drops some hot hyperbole when it says that the game runs worse than CP2077 with max settings and path tracing. I've run (tried to run) CP in such a configuration, and I get framerates in the teens. By contrast I haven't actually bothered to measure the framerate in CS2 because it's perfectly smooth for me.

I'm all for holding developers accountable for flawed games, but the level of negative hyperbole around CS2 has been a real stain on the community.

sundvor
1 replies
18h40m

I just got CP2077 along with a new system, where I experience frame rates regularly north of 150 - almost always north of 100 with PBR, everything maxed out. It runs incredibly smooth 100% of the time, and looks completely stunning on my 32" 2560x1440x144 monitor. Specs are 4090/7800x3d, 64gb 6000c30, 990 pro nvme, bought mostly for being able to run DCS World (and iRacing) on triple screens plus Star Citizen. The 4090 is beast, and absolutely worth it.

I haven't had time or perhaps motivation to load up CS2 much, with that superb Cyberpunk story to be explored (and planes to fly), but on the initial tutorial I noticed a weirdly low fps for what was not a super impressive image.

I installed Skylines 2 through my gamepass; my initial thoughts were to come back after some post release patch cycles.

It took the CP2077 team a lot of time but they completely turned a trainwreck into something rather magic, so I'm hoping Skylines 2 will experience the same. I did enjoy the original release years ago. (Edit: Gamers Nexus' video led me to City Planner Plays, which shows just what can be made - the scope of Skylines 2 looks amazing, given my hardware I could probably get into it sooner rather than later!)

Finally a kudos to the author for this in depth, well written article! I really enjoyed it.

sundvor
0 replies
8h18m

FWIW I put a few hours into building a starting city, and no real issues with performance now in that a game like this clearly doesn't depend on smooth frames quite like a flight simulator does. I think there was at least one major patch since when I first tried it, which might have helped. There's lots of detail and modelling, this could be a bit of a dangerous time sink. :-)

I could certainly see why my old system (5800x3d / 3070 / 64) doesn't play it quite as nicely for my son's bigger 3440x1440x60hz monitor though; it certainly made the 4090 busy.

paavohtl
1 replies
19h34m

The CP2077 comparison is not hyperbole - it is literally how badly this game performs (or at least performed on launch) on top-tier gaming hardware (namely RTX 4090). I linked a source with the quote.

Dah00n
0 replies
18h15m

>The CP2077 comparison is not hyperbole

Except it was hyperbole how Cyberpunk was criticised. Or it wasn't, but it was no worse than most other AAA games; somehow not all publishers/developers are criticised equally. I completed Cyberpunk and had to reload once because of a bug. That is already much better than Starfield (and every other Bethesda game), and the performance was just fine on my old PC. Something about Cyberpunk's massive criticism smells funny.

wilg
3 replies
21h45m

I'm enjoying it too. It's fairly similar to the previous one, and have encountered a few simulation bugs, but I'm happy with it. I'm able to run it okay at 4K.

slimsag
2 replies
21h37m

If I played the old one and enjoyed it, what's the pitch for the new one? e.g. why is it better? On the surface it kinda just looks like the previous one, but I haven't dug into it

solardev
0 replies
20h11m

The roads/transportation networks are better (baked into the engine more deeply now, such as for roundabouts and multi-modal transportation) and the map is much bigger. But honestly CS:1 mods did a "good enough" job at addressing those shortcomings anyway, and CS:2 is missing a lot of the DLC stuff that the first one added. It's got a pretty minimal selection of buildings at this point.

I'd wait a few months/years if I were you. Personally I feel like CS:2 was more of an architectural rewrite (as in the simulation engine) with awesome future potential, but gameplay-wise, modded and DLC'ed CS:1 just has a lot more actual content.

I still enjoyed CS:2 a lot though, if only because it's been a hot minute since the first game, and I forgot how much I loved this genre.

capableweb
0 replies
21h33m

I've played a lot of CS1 (and recently, lots of CS2), here are the biggest improvements for me:

- The simulation is much deeper than before, not basically just statistics on a page

- The game plays slightly harder, more management needed in order to have a proper budget. But like in the first, that disappears once you have 100/200K citizens, as it's hard to fuck up the budget at that stage.

- The control of roads is a lot better, compared to vanilla CS1. Nowhere near modded CS1, but it'll easily get there with some time, the foundation of CS2 is a lot stronger and easier to extend

- Able to build bigger cities with less lag compared to CS1. I'm sure this will improve even more in the future. Going ECS I'm sure made a huge difference in simulation performance.

stouset
0 replies
21h21m

I’m also loving the game. There are a few issues, and performance could be better, but all in all it feels like a really nice entry with solid bones that should lead towards most of these issues being resolved with far less effort than—say—Cyberpunk.

lowbloodsugar
8 replies
20h59m

DOTS is the brainchild of Mike Acton. See his 2014 CppCon talk "Data-Oriented Design and C++" [1]. But Mike has left Unity, according to his Twitter.

[1] https://www.youtube.com/watch?v=rX0ItVEVjHc

frogblast
6 replies
18h51m

Despite the original post talking about DOTS rough edges, I didn't see anything in that article that actually suggested DOTS was the cause: that would cause CPU overhead, but it seems like they simply have a bunch of massively over-detailed geometry, and never implemented any LOD system.

Maybe they could have gotten away with this with UE5's Nanite, but that much excessive geometry would have brought everything else to its knees.

iMerNibor
3 replies
18h36m

> Maybe they could have gotten away with this with UE5's Nanite

Exactly.

If unity actually delivered a workable graphics pipeline (for the DOTS/ECS stack, or at all keeping up with what UE seems to be doing) these things probably wouldn't be an issue.

frogblast
1 replies
18h32m

DOTS/ECS has nothing to do with geometry LODs. Those are purely CPU-side optimizations.

Even if DOTS was perfect, the GPU would still be entirely geometry throughput bottlenecked.

Yes, UE5 has a large competitive advantage today for high-geometry content. But that wasn’t something Unity claimed could be automatically solved (so Unity is in the same position as every other engine in existence apart from UE5).

The developer should have been aware from the beginning of the need for geometry LOD: it is a city building game! The entire point is to position the camera far away from a massive number of objects!

iMerNibor
0 replies
18h26m

To quote from the blog post:

> Unity has a package called Entities Graphics, but surprisingly Cities: Skylines 2 doesn’t seem to use that. The reason might be its relative immaturity and its limited set of supported rendering features

I'd hazard a guess their implementation of whatever bridge between ECS and rendering is not capable of LODs currently (for whatever reason). I doubt they simply forgot to slap on the standard Unity component for LODs during development, there's got to be a bigger roadblock here

Edit: The non-presence of LOD'ed models in the build does not necessarily mean they don't exist. Unity builds will usually not include assets that aren't referenced, so they may well exist, just waiting to be used.

johnnyanmac
0 replies
10h49m

main issue is that DOTS and the whole HDRP/URP stuff started at about the same time, but the goals were completely different. So it would have been nearly impossible to get them working together while DOTS was a moving target. Devs already have multiple breaking updates from the alpha versions of DOTS, an entire GPU pipeline sure wasn't going to rely on that.

>Unity has a package called Entities Graphics

Well that's news to me. Which means that package probably isn't much older than a year. Definitely way too late for game that far in production to use.

oh, so they rebranded the hybrid renderer. That makes a lot more sense: https://forum.unity.com/threads/hybrid-rendering-package-bec...

I'm fairly certain the hybrid renderer was not ready for production.

lowbloodsugar
0 replies
14h38m

Oh sure. I did not mean to imply that. Sorry.

baazaa
0 replies
18h26m

The author's point is that poor support for DOTS meant the devs had to roll their own culling implementation which they screwed up.

tuna74
0 replies
19h0m

The same Mike Acton that claimed that 30 fps games sold better than 60 fps games, using a very questionable data set (excluding sports games etc).

bhaak
8 replies
22h20m

Is there any Paradox game that doesn’t have lots of obvious bugs and terrible UI at release? And only the former gets somewhat addressed over time.

I really wonder how they develop at that place, and what kind of QA they have. I think even applying a crude Pareto analysis would improve their games a lot.

Edit: I stand corrected. I wasn't aware that Paradox is also a publisher and even such a big company (over 600 employees!). Still makes you wonder how they go about their business.

adventured
2 replies
22h14m

> Is there any Paradox game that doesn’t have lots of obvious bugs and terrible UI at release?

It's not a Paradox game, they're the publisher. Colossal Order is the developer.

It's a small developer out of Finland, 30-50 employees.

Dalewyn
1 replies
21h14m

It has or will have twelve dozen DLCs or more, so it's a Paradox game.

getwiththeprog
0 replies
8h35m

I think that the Paradox model IS DLC. They definitely compromise user experience through their money-grabbing, and this cash-above-quality attitude bleeds through to the way their games are published. They blur the line between developer and publisher.

izacus
0 replies
21h11m

Most of them.

frozenfoxx
0 replies
20h38m

I mean unrelated since Paradox is the publisher, but Gauntlet ran like greased lightning, Helldivers ran like greased lightning, Magicka 2 ran like greased lightning…

…of course those were all built on Dragonfly/Bitsquid instead of Unity so that might be a clue about where the issue lies.

capableweb
0 replies
22h18m

Svea Rike II was pretty bug free, but it was released quite some time ago...

brainzap
0 replies
18h3m

Paradox just doesn't do testing. The CEO even admitted it.

Dah00n
0 replies
18h35m

>Still makes you wonder how they go about their business

In a way that sent them straight to the top sellers list on Steam. Sadly, today, it just doesn't matter.

Edit: spelling

olaulaja
6 replies
20h55m

For a bit of reference, a full frame of Crysis (benchmark scene) was around 300k vertices or triangles (memory is fuzzy), so 3-10 log piles depending on which way my memory is off and how bad the vertex/triangle ratio is in each.

paavohtl
4 replies
18h30m

Author here: I never bothered counting the total vertices used per frame because I couldn't figure out an easy way to do it in Renderdoc. However someone on Reddit measured the total vertex count with ReShade and it can apparently reach hundreds of millions and up to 1 billion vertices in closeups in large cities.

Edit: Checked the vert & poly counts with Renderdoc. The example scene in the article processes 121 million vertices and over 40 million triangles.

jiggawatts
2 replies
13h3m

“If you’ve used more triangles than there are pixels on the screen, you’ve messed up.”

I call this kind of thing the “kilobyte rule” because a similar common problem is shoving tens of megabytes of JSON down the wire to display about a kilobyte of text in a browser.

kgabis
0 replies
4h43m

Well, not anymore, Epic's Nanite is basically a triangle per pixel rendering.

Sohcahtoa82
0 replies
9h42m

I'd think today's problem is tens of megabytes of JavaScript to display a couple kilobytes of text.

westurner
0 replies
8h37m

> The issues are luckily quite easy to fix, both by creating more LOD variants and by improving the culling system

How many polygons are there with and without e.g. AutoLOD/InstaLOD?

An LLM can probably be trained to simplify meshes and create LOD variants with e.g. UnityMeshSimplifier?

Whinarn/UnityMeshSimplifier: https://github.com/Whinarn/UnityMeshSimplifier :

> Mesh simplification for Unity. The project is deeply based on the Fast Quadric Mesh Simplification algorithm, but rewritten entirely in C# and released under the MIT license.

Mesh.Optimize: https://docs.unity3d.com/ScriptReference/Mesh.Optimize.html

Unity-Technologies/AutoLOD: https://github.com/Unity-Technologies/AutoLOD

"Unity Labs: AutoLOD - Experimenting with automatic performance improvements" https://blog.unity.com/technology/unity-labs-autolod-experim...

InstaLOD: https://github.com/InstaLOD

"Simulated Mesh Simplifier": https://github.com/Unity-Technologies/AutoLOD/issues/4 :

> Yes, we had started work on a GPU-accelerated simplifier using QEM, but it was not robust enough to release.

"Any chance of getting official support now that Unreal has shown off it's AutoLOD?" https://github.com/Unity-Technologies/AutoLOD/issues/71#issu... :

> "UE4 has had automatic LOD generation since it first released - I was honestly baffled when I realized that Unity was missing what I had assumed to be a basic feature.*

> Note that Nanite (which I assume you're referring to) is not a LOD system, despite being similar in the basic goal of not rendering as many polygons for distant objects.

"Unity: Feature Request: Auto - LOD" (2023-05) https://forum.unity.com/threads/auto-lod.1440610/

"Discussion about Virtualized Geometry (as introduced by UE5)" https://github.com/godotengine/godot-proposals/issues/2793

UE5 Unreal Engine 5 docs > Rendering features > Nanite: https://docs.unrealengine.com/5.0/en-US/RenderingFeatures/Na...

Unity-GPU-Based-Occlusion-Culling: https://github.com/przemyslawzaworski/Unity-GPU-Based-Occlus...

ReactiveJelly
0 replies
18h52m

Sounds right. I remember seeing "1M Triangles" in the performance HUD and thinking, that's crazy, a million triangles. Probably very few shared vertices once you account for edge splits, billboards, etc.

cchance
6 replies
16h12m

I'm sorry, was he benchmarking this at 5120x1440? Like, seriously, do people really game at that resolution?

kristofferR
1 replies
15h32m

That's the best resolution; it's the resolution of 49" 32:9 ultrawide monitors like the Samsung Odyssey Neo G9.

Look at images of it from the top, that'll help you understand how immersive it is.

paavohtl
0 replies
7h22m

Author here! That's my exact monitor. I actually started writing a blog post about it earlier this year when I got it, but didn't manage to finish it yet. It also has its own share of technical issues, though overall it is the best monitor I've ever seen or used.

stefan_
0 replies
15h9m

He is explicitly not benchmarking, just exploring the render pipeline.

jeroenhd
0 replies
13h35m

To me, that seems like an absolutely amazing resolution for many games. Very rarely do you need more pixels in the vertical direction (most games will just end up rendering sky or ground), while more width from left to right is exactly what you want.

That resolution is still fewer pixels than a plain 4K display, which modern games seem to drive at 60fps quite regularly if you buy beefy hardware. A graphical gore fest like Doom runs at over 100fps at 4K on the card the author has, so a mostly static city builder should also run fine at a resolution like that.

ihuman
0 replies
16h2m

That's two 1440p 16:9 monitors next to each other

denkmoon
0 replies
16h6m

Yeah, it's referred to as super-ultrawide. There are a reasonable number of monitors out there with this resolution.

It's more for race/flight sim type stuff IMO but if you got it, why not play everything at that res provided it performs well.

butz
6 replies
20h48m

Now imagine the performance if this game could be written by a single person in assembly...

MagicMoonlight
2 replies
20h31m

Without the stupidly implemented front end draining the resources you could easily have millions of people in your city.

qwytw
1 replies
18h22m

Why do you think the UI is causing all the performance issues?

Nition
0 replies
16h33m

I think they mean the visuals as a whole.

solardev
0 replies
20h36m

Roundabout Tycoon™: Untoothed Edition

rasz
0 replies
20h34m

Code is not the problem here, assets are. Even Chris Sawyer had an artist assigned for Transport Tycoon, otherwise you would end up with programmer art.

Krssst
0 replies
16h55m

Release date: 2043.

The feature set is far from the same scale as RCT1.

CooCooCaCha
6 replies
22h40m

I hope these issues come from the game being rushed and not from a lack of rendering expertise.

Luckily it seems like there are pretty simple reasons for the poor performance so I'm hopeful they can at least do something even if they don't have a ton of rendering expertise.

capableweb
5 replies
22h27m

I think the guess in the article is pretty close to the truth; I've seen stuff like that happen countless times. You make a bet on an early technology (Unity DOTS + ECS in this case) which gives you a lot of benefits, but it's also immature enough that you get a bunch of additional work to do, and you barely have time to get everything in place before the publisher forces you to follow the initial deadline.

lamontcg
4 replies
21h55m

100,000 vertices for a pile of logs isn't really a bad bet on tech, though. That is just piling vastly more onto any tech stack than it can handle, with nobody having the time or the political okay to do a perf pass through the code and put all these ideas on a diet.

But that means that everything is solvable. There's no need in this game for 100,000 vertices for a logpile, so that should be a relatively straightforward task to fix. And someone can rip out all the teeth and put "Principal Tooth Extraction Engineer" on their resume.

capableweb
3 replies
21h27m

> 100,000 vertices for a pile of logs isn't really a bad bet on tech, though. That is just piling vastly more onto any tech stack than it can handle, with nobody having the time or the political okay to do a perf pass through the code and put all these ideas on a diet.

I can easily see this happening though.

Artist starts making assets, asks "What's my budget for each model?" and engineering/managers reply with "Do whatever you want, we'll automatically create different LODs later". Then, the day the gold master is due, the LOD system still isn't in place, so the call gets made to just ship what they have, because otherwise the publisher's deadline would be missed.
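For what it's worth, the runtime half of an LOD system (picking which variant to draw each frame) is conceptually tiny; the hard part is authoring or generating the simplified meshes. A minimal distance-based picker, with made-up thresholds, might look like:

```python
def pick_lod(distance, thresholds):
    """Return the LOD index for an object at `distance` from the camera.
    `thresholds` is an ascending list of switch distances; index 0 is the
    full-detail mesh, higher indices are progressively simpler meshes."""
    for lod, limit in enumerate(thresholds):
        if distance < limit:
            return lod
    return len(thresholds)  # beyond the last threshold: cheapest mesh / impostor

# Illustrative thresholds: switch at 50, 150 and 400 units.
assert pick_lod(10, [50, 150, 400]) == 0    # close-up: full detail, teeth and all
assert pick_lod(200, [50, 150, 400]) == 2   # mid-range: simplified mesh
assert pick_lod(1000, [50, 150, 400]) == 3  # far away: billboard/impostor
```

Real engines usually key off projected screen-space size rather than raw distance (Unity's LODGroup uses screen-height percentages), but the selection logic is this simple either way, which is why shipping without the simplified meshes, not without the picker, is the plausible failure mode.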

munificent
1 replies
17h39m

Anyone in management or engineering who tells an artist they have no texture or mesh budget at all gets exactly what's coming to them.

johnnyanmac
0 replies
10h17m

That's essentially what Epic is trying to tell devs these days. Don't know if it will ever truly live up to such lofty goals, though.

CountHackulus
0 replies
18h41m

That sounds like exactly what happened. I've been in that position many times in games I've worked on and seen it happen.

honkycat
5 replies
20h21m

Unity has been stalling on its DOTS and network stack re-implementation for like 5 years now.

There is no excuse other than leadership are cashing the checks and squeezing the juice out of the company until they close it, which would make sense looking at their semi-recent merger and poor behavior by the CEO.

Seriously, I was looking into Unity at the start of Covid while laid off, and DOTS was "around the corner" even THAT far back!

They still don't have an answer for a network stack, and now LOD is broken? LMAO.

Unity has been a dirty word for me for a number of years. This is the pay-off for dismissing people's concerns and insisting it will buff out eventually.

kristianp
3 replies
18h26m

When reading sections of the article about Unity's permanently experimental features, I was wondering why they didn't use a different engine (probably because their expertise is in Unity). Does Unreal for example have support for this kind of game?

Oh, and I have to mention the cascaded shadow mapping: "taking about 40 milliseconds or almost half of total frametime" - 40ms is 25fps all by itself!

munificent
0 replies
17h32m

Unless you've worked in games, it's really hard to understand just how massively tied to the engine a game is.

Imagine you have a web app written in Ruby using Rails with data stored in Postgres. You have a few hundred tables, millions of rows. Millions of lines of Ruby, CSS, and HTML. Thousands of images carefully exported as JPEG, GIF, or PNG depending on what they contain.

Now imagine being told you need to simultaneously port that to:

- Not run on the web. Instead be a native application using the OS's rendering.

- Not be written in Ruby. Instead using a different language.

- Not use Rails. Instead, use a different application framework.

- Not use Postgres. In fact, don't use a relational database at all. Use an entirely different paradigm for storing data.

- Not use CSS, HTML, and raster images. Instead, everything needs to be, I don't know, vector images and Processing code.

That's about the scale of what moving from one game engine to another can be like. Depending on the game, it isn't always that Herculean a task, but it often is. A mature game engine like Unity is a programming language, build system, database, asset pipeline, application framework, IDE, debugger, and set of core libraries all rolled into one. Once you've picked one, moving to another may as well be creating a whole new game from scratch.

honkycat
0 replies
17h53m

unreal absolutely does have support for this type of game. It is really a smattering of different loosely coupled tools that you can bring into the project, or roll your own. The star is absolutely the rendering engine and visual programming tools IMO.

Unreal really excels at action games, but you can absolutely implement custom camera RTS controls and such. Batteries included but replaceable.

The question "does X engine support this kind of game" is a bit.. off. You would be amazed at how many features these engines pack in. However: You would also be amazed at how NOT "plug and play" they are. It still takes a TON of effort and custom code to make a sophisticated game.

bluescrn
0 replies
17h31m

Unity's killer feature is its C# support. It's enabled small teams to do far more in less time than they could ever hope to do using C++, while avoiding some of the nastier types of bugs that C++ game developers have to deal with.

And if you've got a team of experienced Unity developers, some of whom have been working with C#/Unity for their entire game dev career, switching to C++/Unreal isn't the most practical option. While many concepts are similar between engines, you're going to be back at the bottom of the learning cliff for a while.

It sounds like Unity isn't really the problem in this case, it's more about too many polygons, poor use of LOD, and sub-optimal batching (too many draw calls). It was probably more of a time pressure issue than a tech issue, as game development is usually a race against the clock.

johnnyanmac
0 replies
9h6m

>I was looking into Unity at the start of Covid while laid off, and DOTS was "around the corner" even THAT far back!

To be fair, DOTS 1.0 did "release" in 2022, I believe. IIRC the core came out and the biggest omission was DOTS animation. Many core parts of DOTS have been deemed "production ready" for about a year now.

>They still don't have an answer for a network stack, and now LOD is broken

broken implies that Unity had a robust LOD system to begin with. That's always been very primitive since Unity's most successful projects were mobile or 2D.

But yes, the netcode has been in absolute shambles for years. Very confusing, since that is the backbone of mobile. I'm guessing there's some popular third party they defer to that integrates well with Unity.

ok_computer
4 replies
16h20m

This is my favorite video using the poor rendering in the game to make cool art effects. Full disclosure: there may be some artistic license and post-processing, but it's hilarious you can get these visuals in a modern game.

https://www.youtube.com/watch?v=FQ13OFmRF-4

LtdJorge
3 replies
14h25m

Wrong video? I think that one is using Mirror's Edge Catalyst

pierrec
0 replies
13h8m

That right there is the city from the original Mirror's Edge (2008). I can tell from having played it a lot. And what a great idea for a music video.

ok_computer
0 replies
3h29m

Wrong video on my part.

hombre_fatal
0 replies
12h43m

They are referring to the video's focus on low-detail game assets that are only rendered in the distance. Though apparently something the game in TFA sorely lacks.

tomcam
3 replies
22h52m

You know things are rough when…

    The game ran so poorly
    that Windows Game Bar
    refused to acknowledge
    there even was a framerate.
booboofixer
2 replies
22h47m

> The teeth are not the only problem

> this written article from PC Games Hardware (in German) or this video from Gamers Nexus (in Americanese)

Humor and a great blog design, made my day.

belltaco
1 replies
22h31m

Isn't GN Canadian? I mean North Americanese.

Arnavion
0 replies
22h18m

GN is in the US. Maybe you're thinking of Linus Tech Tips.

timeon
3 replies
21h18m

This is not about performance but I find the example to have unrealistic design.

The 'city' in the article has a population of 1000. That would be a village where I live - but it is drowned in pretty wide roads and a huge number of parking spaces.

Filligree
1 replies
21h1m

Cities Skylines, somewhat infamously, only allows American city designs to work. If you want to build something European, you can't.

anticensor
0 replies
11h15m

Cities:Skylines 2 actually supports mixed use zoning and has fixed the parking problem. No more pocket cars.

ziddoap
0 replies
20h45m

A general rule of thumb I always used was to take whatever the in-game population says and multiply it by 10 to get a closer match between the size/complexity of the city and what the population would more realistically be.

If I recall correctly, there was even a few mods that did a similar re-balancing of the population numbers in the first game.

rychco
3 replies
22h35m

Much like Cyberpunk on its release, this will hopefully be de-listed from the various digital storefronts and all customers issued a refund in full.

asylteltine
1 replies
21h25m

Cyberpunk got delisted by Sony because it literally didn't function. It was playable but buggy on PC.

thrillgore
0 replies
20h41m

Cyberpunk was delisted because CDPR told consumers to request refunds.

capableweb
0 replies
22h28m

Unlikely. The game actually does run, even though it has performance issues and bugs. Cyberpunk barely ran on the PS4 when it launched.

gardnr
3 replies
16h55m

I really appreciate the writing style:

> This pass is surprisingly heavy as it takes about 8.2 milliseconds, or roughly about far too long, ...

promiseofbeans
2 replies
15h28m

I wish more people would write like this, sprinkling little bits of humor in a somewhat serious piece

getwiththeprog
1 replies
12h18m

You might then like RockPaperShotgun and The Register

promiseofbeans
0 replies
9h34m

Thanks!

coldcode
3 replies
21h52m

Any time you try to do something complex you have to start with the state of the art at the beginning of development; like in this case, you might have to use something not quite ready for prime time, hoping it will improve enough during development. If it doesn't, you have to roll your own, which is often hard to do since you didn't start out that way. I worked on an MMO game engine 10 years after it was released years too early, and it was still loaded with horrific code and architecture that was hard to correct or improve since we still had to ship regularly. The engine was designed (if you can call it that) in 1998 and was entirely too complex for that era.

If you only stick to the mature tech, then you may wind up being potentially unable to even produce your (advanced) game. It's always a challenging tradeoff. As long as the game is playable and you make enough to continue improving you might be OK; but of course you might go belly up before you fix it enough.

By the time CS2 makes it to Mac, it might be improved enough to actually play!

solardev
0 replies
20h52m

Everquest?

rasz
0 replies
20h57m

Not implementing any LOD is not "state of the art"; it's "let me just click here and import those assets we bought from a middleware company as-is, with no processing or inspection".

badindentation
0 replies
20h43m

Runescape?

ajkjk
3 replies
21h11m

Does the game have a campaign mode yet? I'm desperate. I loved C:S 1 for a few hours and then had nothing to do. Just need some kind of challenge mode and I'd be hooked for life.

capableweb
1 replies
21h4m

Campaign mode? It's a sandbox game; like countless other sandbox simulators, you make your own goals :)

Otherwise I guess you could ask people for the savegames of almost broken cities and try to rescue them?

ajkjk
0 replies
20h45m

Tons of sandbox games have campaigns. I think Roller Coaster Tycoon, for instance, was the perfect example.

solardev
0 replies
20h34m

Have you tried the mods yet? I had a lot of fun coming up with my own goals, like perfect happiness without cars (using only transit, bicycles, and walking paths), or flood-proof cities (the hydro simulation is pretty fun to mess around with), or rebuilding a city after an asteroid (or maybe fifty...) hit it...

flipgimble
2 replies
20h14m

One conclusion is that Unity's technology stack is more like a Tower of Jenga with pieces in random state of readiness and compatibility. More and more Unity comes out looking like a mismanaged mess that lost its way from the goal of "democratizing game development".

I'm guessing the studio was pressured to release on a hard deadline to make the publisher their promised profits. This game would have been a guaranteed day-one purchase for me, but the bad press coverage made me move on to other games in the limited time I have. So I may check back in 6 months, or never, to see if the updates fixed the performance. What I'm saying is that the person who decided on this release date likely made a long-term strategic mistake for short-term profit, or was forced to do so by another idiot up the chain.

MrLeap
1 replies
20h5m

> Unity's technology stack is more like a Tower of Jenga

Unity has a RELATIVELY firm foundation, and lots of optional outbuildings that are short Jenga towers. It does take some bonecasting to identify which is which. I would characterize the use of DOTS in a product you mean to ship as _gutsy_. For my projects I'd be liable to first write a compute shader for anything they used DOTS for.

Despite all that, from my reading it doesn't look like CS:2 really got bit too badly for using it. Their perf issues are more broadly explained by poor LOD coverage, treating their addons like black boxes, and no occlusion culling. These are issues that no stack is immune to.

asdff
0 replies
19h5m

If your stack merely had "run the production code on a typical user environment" as part of the process, you'd have caught this ahead of release though.

cube2222
2 replies
22h23m

Tip for those wanting to play it: change resolution scaling from dynamic to constant.

I have a 3080 and it basically moves it from "unplayable 10fps in the main menu" to "works just fine, no issues in game" with medium-high graphics.

stouset
0 replies
21h30m

Or off, entirely. On my 3080 it seems to cause lots of rendering artifacts.

dawnerd
0 replies
10h14m

That’s not what fixes it. Disable motion blur and depth of field. Depth of field kills the menu in particular.

PeterStuer
2 replies
10h14m

So the game uses extremely detailed models, then fails to have an intelligent way of abstracting/culling away the things that will never make it into pixels. Is that a fair summary?

Also, does that mean that easy fixes are available or is this so core that solutions would require going back to the drawing board?

sbergot
1 replies
8h45m

We can only speculate on the difficulty of the fixes. I am not an expert, but I believe culling and LOD are two big things for engines. Not being able to use existing implementations is really unfortunate. I guess the "tricks" are well-known, but I don't think the implementation is easy.

Ciantic
0 replies
6h45m

I'm not a GPU programmer, but I became curious about the state of the art of culling, and found that Arseny Kapoulkine [1] has implemented these techniques in an MIT-licensed C++ code base:

Cluster cone culling, frustum culling, automatic occlusion culling, triangle culling, meshlet occlusion culling, and culling optimizations. On paper this sounds impressive, and maybe Colossal Order could learn from it.

https://github.com/zeux/niagara
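Of the techniques in that list, frustum culling is the simplest to sketch. Assuming bounding spheres and frustum planes stored as (nx, ny, nz, d) with inward-facing unit normals (one common convention, not the only one):

```python
def sphere_in_frustum(center, radius, planes):
    """Frustum culling test for a bounding sphere.
    `planes` is a list of (nx, ny, nz, d) tuples with unit normals
    pointing inward; a sphere is culled as soon as it lies fully
    outside any single plane."""
    cx, cy, cz = center
    for nx, ny, nz, d in planes:
        # Signed distance from sphere center to the plane.
        if nx * cx + ny * cy + nz * cz + d < -radius:
            return False  # completely outside this plane: cull
    return True  # inside or intersecting: must be drawn

# Toy "frustum": the axis-aligned box 0 <= x, y, z <= 10 as six planes.
box = [(1, 0, 0, 0), (-1, 0, 0, 10),
       (0, 1, 0, 0), (0, -1, 0, 10),
       (0, 0, 1, 0), (0, 0, -1, 10)]
assert sphere_in_frustum((5, 5, 5), 1, box)       # well inside: drawn
assert not sphere_in_frustum((30, 5, 5), 1, box)  # far outside: culled
```

Occlusion culling (skipping objects hidden behind other objects) is the much harder sibling, which is presumably why it is the one the article found missing.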

dimgl
1 replies
18h13m

Just wanna give a shout out to the brief mention in the article of Anno 1800. Quite possibly one of the best games I've ever played.

sundvor
0 replies
8h16m

I bought it based on this comment, on sale now at Steam! It looks great, thanks.

mrcwinn
0 replies
21h14m

I loved every bit of this post, especially the final few sentences. Thank you.

kookamamie
0 replies
8h44m

The real summary: Unity is used as the rendering engine and various rendering options are toggled on without much worry about the big picture. Add in some Unity "packages" and the patchwork of rendering features with unpredictable (and hard-to-fix) performance is ready.

hooby
0 replies
8h16m

The TL;DR:

Analysis showed that most frame time was lost due to a lack of LOD levels and culling, causing the engine to render an insane number of polygons that weren't even visible on screen.

In conclusion, it seems that using the brand-new engine feature DOTS (which is a perfect match for this game) successfully solved the CPU bottlenecks the first game had.

But because Unity's DOTS <-> HDRP integration is still WIP (with several key features experimental or missing), they had to implement a lot of stuff on their own (including, but not limited to, culling), which cost development time and caused the game to not be quite ready at release (which explains the missing LODs/optimization).

hertzrut
0 replies
19h18m

Unity is hostile to serious game development

davbo
0 replies
5h8m

Laundry boys is a superb name for clothes pegs

Great article @paavohtl!

dang
0 replies
19h31m

Hey all: this is an interesting article. Can we please discuss what's specifically interesting here?

Threads like this tend to become occasions for responding generically to stuff-about-$THING (in this case, the game), or stuff-about-$RELATED (in this case, the framework), or stuff-about-$COMPARABLE. There's nothing wrong with those in principle but each step into genericness makes discussions shallower and less interesting. That's why the site guidelines include "Avoid generic tangents" - https://news.ycombinator.com/newsguidelines.html

cratermoon
0 replies
19h37m

To paraphrase, "Colossal Order's programmers were so preoccupied with whether or not they could, they didn't stop to think if they should"

bob1029
0 replies
21h36m

The whole industry seems in need of a major reset event (aka true competition).

We wasted a LOT of innovation tokens on VR, ray tracing, battle royales, etc. The availability of OSS and COTS engines has been an uplift on paper, but brought with it an entire new universe of downsides with regard to actual player experiences.

For better or worse, I believe that a higher barrier to entry is a good thing for a major creative effort like a video game or feature length film. Both of these typically require involving more than 1 human in a deep, passionate way. You really want to make sure you have the right vision & people or it's going to turn out shit.

Would you rather have 30 kinda meh games you can play for ~20-50 hours each, or 2 super incredible games that have endless replay value? At a certain point, the value proposition goes discontinuous with this form of entertainment. In my view, you should always seek this criticality and consider how severe the economics are if you fail to reach it. You can't necessarily plan to build something that will last as long as WoW from the beginning, but you can certainly ask yourselves "is this still fun to play?" on a daily basis.

I'm not saying we go back to the abacus, but there is a price to be paid for the level of abstraction we are operating with today. You stop thinking about things and they leak into the player feel. When you are working on top of a physics engine that you separately tuned for 1000 hours under extreme duress, you know exactly when something isn't quite right and can do something about it immediately. Every COTS engine is a very leaky abstraction that turns into a titanic disaster the moment you desire things like bespoke multiplayer or platform functionality.

I also think there is a vertical ownership crisis in the industry. When you have all of your assets being produced in a separate silo, you should expect a very manufactured look & feel when they are combined with the rest of the product. Less content would probably feel like a lot more if we'd slow down a bit and re-integrate the artists, developers, testers, etc. under fewer hats.

arendtio
0 replies
1h14m

I am surprised that the issues are so basic. When I heard that Cities: Skylines 2 had bad performance, I expected something like the GPU being used for the simulation as well as the graphics without a good scheduler (something new/unproven).

However, after reading this, it seems to be mostly a rendering issue with even basic stuff missing, like LOD models or occlusion culling?!? Feels like someone had to optimize the rendering engine after Unity didn't deliver, but skipped the '101 of how to optimize a renderer' in the first place.

akasakahakada
0 replies
11h55m

Let's see how many of these still hold up.

1. Optimization does not bring profit.

2. Bad code makes developers' lives easier.

3. Hardware nowadays is fast enough that we don't need to worry about efficiency.

Unfrozen0688
0 replies
19h57m

Imo graphics peaked on PS2. All I need is RE4 / FFXII style graphics rendered in 4k resolution. Sublime.

Stevvo
0 replies
8h50m

Unity has built-in profilers that show exactly what RenderDoc + NSight do, and far more. The blame lies squarely with the studio for failing to do the basic optimization that is required for any game. The many issues with Unity's engine development and features are unrelated.