I think GPT-style AI is here to stay, and it’s now getting so cheap to run that anybody with a reasonably beefy machine can run it locally. With techniques like LoRA you can also build on and combine existing trained models, so you don’t have to train these things from scratch either. Putting the toothpaste back in the tube isn’t really feasible at this point in my opinion. It’ll be interesting to see how human creativity develops alongside this tech.
I personally disagree that this will kill the human creative drive; if anything, I think this tech can let a lot more people express themselves. Lots of people have creative ideas but lack the skills to express them well. GPT lets anybody easily bring their ideas to life and share them with others.
I do agree that there needs to be moderation over content in general, but I’d argue it’s not really an AI-specific issue. Long before AI came on the scene, there was a problem with judging the quality and validity of content. The US famously uses the internet as a political weapon to push its propaganda and dominate the information sphere of other countries. AI is an amplifier: it lowers the barrier for content production.
You used to need a whole editorial staff to pump out articles, and the US has polished media companies whose sole goal is to churn out propaganda. That made it difficult to compete and to put out alternative narratives. With tools like GPT it’s now much easier for anyone to put out their message and make it look polished.
In terms of creativity, I’ll point out that a lot of similar arguments were made when photography was invented. Artists derided photographers, saying it was too easy to create pictures with a camera. Eventually art adapted, and photography evolved into an art form of its own. I expect something similar will happen with AI-assisted art.
I think there is a plateau to how far things can escalate in practice. The limitation is ultimately how much information humans can process in a day. It’s also worth noting that we’re already drowning in propaganda and junk information. I’m not sure that further saturating the information space will fundamentally change things.
And I agree there’s an unethical side to how models get trained, especially in the case of proprietary models. I’d argue that’s more of a capitalism problem than a technology problem though.
I imagine that in the end people are still going to specialize, and they’ll leverage these tools to automate a lot of tedious work in their professions. It’s never been the case that new kinds of technology and automation led to stagnation; the opposite is generally true, with a huge explosion in creativity and invention.
I do think societies like China will make much better use of this tech than the West will, though, because there is central direction over how the tech is used and what it’s directed towards.