Let's talk about AI!

That’s because AI teaching itself hasn’t started yet (though according to Musk it’s just started), so it’s still reliant on humans to train it. All the biggest tech companies are now forcing their developers to use/train AI, so it’s going to accelerate, but once AI can train itself, the growth rate in its reasoning will be exponential. By exponential I mean growth so fast that we as humans literally can’t comprehend it.

PS. If all you’ve ever used is Microsoft’s AI, I’m not surprised you feel this way.

“I’m sorry Dave, I’m afraid I can’t do that,” entered my head.

4 Likes

Well, my team just got the AI bomb dropped on us. We’re starting our first “AI-first” project, and will work our way up to all new projects being AI-first…

4 Likes

It goes over how to create a plan that generates the ideal input, which will give you an ideal output. He must have said a dozen times that if you’re getting AI slop, it’s your input that’s the problem, not the model.

What are you unsure about? Whether the video goes over how to create an input plan, or whether the person in the video says that if you’re getting AI slop the problem is your input?

With all due respect, a lot of the people here talking about how ineffective AI has been for them have decades of experience in this field while you have very little to none. They are also working on much more complex and large scale software/applications than those you’ve applied AI to create. I think you can take them at their word about the problems they’re experiencing with AI in their field instead of saying “the problem is you just aren’t using it right, trust me”.

4 Likes

I’m just telling you what the developers in the video with years of experience said. I didn’t say it.

Also, I haven’t seen anyone here talking about doing what they recommend in the video. I’ve only seen people with a bias against using it trying to implement it because they were forced to.

We built computers to do maths faster.

We created programming languages to write the computer maths instructions faster.

Now we have tools that supposedly make writing the faster faster thing faster again. The problem is that people already do that job very well and in a way that’s difficult to get a machine to do, because we created computers and so we designed them to be programmed by humans.

So yes, being a programmer will make you ‘biased’ against tools that do your job badly and act out inadvisable design choices. Learning how to write programs yourself is not hard. In fact, I’d say it’s easier than walking. If you learn Python today it can save you hours of manually renaming files, cropping and dating photos, allow you to make various graphs from arbitrary datasets, and change your life for the better in other ways you couldn’t have imagined. It’s a sharp tool; using it is a skill. Once you have some experience you can get very expressive, writing code so complicated and nuanced that you can’t even really put it into words anymore.
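To make the file-renaming claim concrete, here’s a minimal Python sketch of the kind of chore the post describes: prefixing every photo in a folder with its modification date. The folder layout, function name, and naming scheme are all made up for illustration; only the standard library is used.

```python
from datetime import datetime
from pathlib import Path

def date_prefix_photos(folder: str) -> list[str]:
    """Rename every .jpg in `folder` to 'YYYY-MM-DD_originalname.jpg'."""
    renamed = []
    for path in sorted(Path(folder).glob("*.jpg")):
        # Use the file's modification time as the date stamp.
        stamp = datetime.fromtimestamp(path.stat().st_mtime).strftime("%Y-%m-%d")
        if path.name.startswith(stamp):
            continue  # already renamed on a previous run
        target = path.with_name(f"{stamp}_{path.name}")
        path.rename(target)
        renamed.append(target.name)
    return renamed
```

Ten lines like these, run once, really can replace an afternoon of manual renaming, which is the kind of quality-of-life win the post is pointing at.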

3 Likes

It would only “act out an inadvisable design choice” if you failed to tell it what design choice it should make. That’s the point they are making in the video.

Either you can tell it what you want it to do and it will do it in the best possible way, or you’re better off just writing the code yourself because you’d have to be so absurdly specific…

‘Uh no, I want you to use lazy evaluation for this list transform because the repeated reallocation is too expensive’

‘Actually this resource is being accessed from two threads so you need to use a mutex and lock and unlock it in a way that doesn’t cause data races every time the resource is used’

‘Your render setup churns too many resources, I need you to cache them upfront and only re-evaluate them when any one of these 14 conditions is met, which would incidentally change the resources’

‘Stop using AVX512 it’s outside of our target CPU feature set’

‘You’re using this resource after freeing it… and this other resource is never freed…’
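The first of those corrections can be sketched in a few lines of Python (names are illustrative, and generator expressions stand in for whatever lazy mechanism the target language offers): the eager version allocates a full intermediate list per step, while the lazy version transforms one element at a time.

```python
def eager_transform(values):
    # Each step materializes a complete intermediate list.
    doubled = [v * 2 for v in values]
    return [v + 1 for v in doubled]

def lazy_transform(values):
    # Generator expressions evaluate one element on demand;
    # no intermediate list is ever allocated.
    doubled = (v * 2 for v in values)
    return (v + 1 for v in doubled)
```

Both produce the same results, but the lazy version lets a consumer stop early without paying for the whole list, which is exactly the distinction you’d otherwise have to spell out to the model.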

All of this requires fully understanding the code you get which is slower than just writing it. Reading code that’s not yours is so much slower than writing or reading your own code.

In a codebase you also need to think about maintainability & expandability. You don’t want to be babysitting someone making sure they use a switch-case block instead of an if-else chain every time there’s a place that you know will need to be expandable in the future.
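The expandability point can be sketched in Python (which has no switch-case, so a dispatch table plays the same role; the event kinds and handlers are hypothetical): with an if-else chain, every new case means editing the function body, while with a table, it’s one new entry.

```python
def handle_event_chain(kind: str) -> str:
    # if-else chain: each new event type means touching this function
    # and hoping no branch is missed.
    if kind == "click":
        return "clicked"
    elif kind == "hover":
        return "hovered"
    else:
        return "unknown"

# Dispatch-table equivalent: extending behavior is one new entry,
# and the unknown case is handled in exactly one place.
HANDLERS = {
    "click": lambda: "clicked",
    "hover": lambda: "hovered",
}

def handle_event_table(kind: str) -> str:
    return HANDLERS.get(kind, lambda: "unknown")()
```

That structural choice is invisible in a demo but is exactly the kind of thing you end up babysitting in a long-lived codebase.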

3 Likes

AI has an incentive issue. If your model is incentivized to make things fast and cheap, it’ll cut corners; if it’s incentivized to make me happy, it’ll just make stuff up… lookin at you, ChatGPT. When the model is tweaked to prioritize something, it’ll often have undesirable output, and honestly a lot of it is very human. The Anthropic vending machine experiment tells a valid story: AI is not really able to do anything more complex than entry-level tasking at the moment without making a mess.

1 Like

Yeah, the AI-first stuff is wild, because paying folks to review AI slop code is so much more inefficient than just having them write code and having AI help review it for errors. Absolutely bonkers investor type of thinking.

2 Likes

Honestly I’ve debated writing all this, but my biggest issues with the current push-everything-to-AI trend and its unsustainable growth are infrastructure and Moore’s law.

Moore’s law broke during Covid with the supply chain. We no longer have exponential growth in compute, and it’s starting to show. We are hitting walls in fabrication and manufacturing that have simply slowed advances to the point that they’re mostly coming from software, and hardware isn’t keeping up. We get more efficiency out of each core every other production cycle, but more power is simply coming at the cost of bigger, more power-hungry systems.

Now that brings us to power. The grid in the US specifically hasn’t been updated in generations. We are very behind in infrastructure, and the Atlantic seaboard is going to show this hard in the coming 12-16 months. At the current growth of the AI datacenter boom, Virginia will not have enough power output to meet demand by the end of 2027, and the closest expansion projects for more capacity and output are tracked for late ’29… That’s going to hit a crux. Either we start cutting power and rolling brownouts across the east coast, because it’s an interconnected grid (or worse, some states will isolate themselves from the grid like Texas to protect their own citizens, creating even more disjointed and broken systems), or we limit the datacenter boom significantly, either by cutting power to datacenters during peak hours (forcing them onto generators) or by shutting down buildout projects and the expansion of older, non-efficient datacenters to protect the grid, which will hit the economy and the internet hard.

Some might say build them elsewhere, but the last big boom, the internet boom of the early 2000s, was the last time we built major infrastructure around data lines, and that resulted in the major backhaul connectivity of the internet all converging in 5 major epicenters around the US: Ashburn, VA; New Jersey, right across the river from Manhattan; Seattle, WA; Dallas, TX; and San Jose, CA. Those are the points of presence (PoPs) in the US where the most major network connections cross. This is why Ashburn, VA has become the hub of the internet: over 70% of global internet traffic flows through my backyard.

Building infrastructure to put connectivity at a point where it wouldn’t experience high latency is a massive multi-billion-dollar, multi-year project no one is willing to take on elsewhere, so they keep building around the epicenters along the train tracks (most dark fiber was laid following train tracks).

So yeah, we have a power issue that’s coming to a crux very soon, and either everyone in VA will need a generator or solar in the next two years, or the boom will be stalled hard and impact the entire world, giving many companies that made promises they probably can’t fulfill an out (OpenAI).

So then this takes us to training data. We ran out. It’s done. There’s nothing left on the internet that hasn’t already been sucked up, besides some private companies holding on to PII or other valuable data that is hard to offload due to compliance or governance restrictions, hard copies of data that just haven’t been digitized yet, or the very few holdouts with private repos of data that refuse to sell for whatever reasons.

So what are they doing? Synthetic data (AI-generated training data), which is way slower to advance the models on, and using platforms like Meta, Google and such to have us generate new data through our own behavior.

That’s why GPT-5 wasn’t as exciting as they led on. One, they’ve had to throttle compute, as they need more AI datacenters to meet demand, forcing them to push as much as they can to the older, less power-hungry models until they can build out more, because hardware no longer scales up in compute: you just need more of it, which needs more water and power. And two, the training data is now AI-slop-fed, so the learning isn’t nearly as impressive, because AI hasn’t hit the point where it’s teaching itself new things; it’s just creating more complex and optimized datasets that need more human feedback to ensure accuracy.

There was a video a while back of an interesting scenario of AI taking over the world in a decade, but the reality is we are hitting hard, impactful limits. I work in spaces where, in DC, there are task forces dedicated to failsafes for when we hit the 2028 power crisis, and the holdup to adding more output or capacity is politics, so fat chance that’s gonna get fixed.

So yeah, there’s my rant; I’ll hop off my soapbox now. My doomed viewpoint that’s bound to get downplayed.

5 Likes

Honestly, even if the real problems of power, datacenter expansion, and model training are solved, and in a couple of years AI can just do all of our jobs, is that even going to be a good thing, really? I’m sure my billionaire founder will keep paying my health insurance as a thank-you for training his AI models…

3 Likes

I’ll watch this video on the clock tomorrow. I’m required to use AI, so trying to keep an open mind despite the problems. I can disagree with management and still do my best to enact their requests, that’s my life.

3 Likes

A bit late to this convo, but I saw a post on reddit that had a r/LinkedInLunatics vibe where a Manager level or C-Suite guy basically said, slop is going to be the new baseline because “[Speaking to managers, I believe] you’re going to fall behind while your company is denying ‘less than perfect’ PRs, another company is going to send ‘slop’ to prod” (Verbatim, but the vibe is conveyed appropriately)

Which, I mean, I guess makes sense, until you realize that eventually a human is going to have to look at the spaghetti code to fix a problem AI can’t understand.

2 Likes

Personally, do I think AI could write decent code without humans? Eventually. In the near future, yes, possibly even within our lifetimes. In the next few years? Doubtful. Unfortunately, this push for AI (AI-first) is driven by greed and profit, which will ultimately be its own downfall. The harder and faster they keep driving it, the worse it’s going to be. Then the AI bubble will burst, causing it to be regarded as nothing more than a Furby of the 2020s for some time. Then it’ll eventually be picked up by the academic community and really flourish.

Edit: Welp :zipper_mouth_face:

4 Likes

I do not think that any professional is ‘anti-AI’. It would be great if AI could implement new code as well as humans, or better. I would love to be able to tell it what to do and have it work magically. Then I could move on to specifying incredible, complex software systems to do new things that we can only dream of today. The problem is professional developers can see that this is currently a pipe dream. Non-professionals seem to have a different take. I, for one, would love to be proven wrong.

1 Like

Created another app today. It’s an app to assist with the Army Corps of Engineers Stream Quantification Tool. Took exactly 4 prompts to get it to this point. Total time invested: approximately 8 minutes, while watching Sesame Street with my one-year-old. It connects to the device GPS for location, calculates the projected uplift for credit generation, and exports everything in a PDF report or CSV. Do you really think you could make a better app in less time while watching TV with a toddler???

1 Like

I do not want to harsh your buzz, but these are trivial applications. I would hesitate to even call it an ‘app’. It is more like a website form that gathers information for submission to a back-end for processing. There are any number of form-building applications for the AI to copy and utilize for this function. There is nothing new here, and there are no custom business rules or features for processing being done by your code. That is where we developers come in.

1 Like