Because you are speaking English and you are focusing on China instead of somewhere your voice might be heard. What are we supposed to do, bomb their civilians so they stop treating their civilians poorly? lol come on dude
Yeah, sure, clean all yards, but start with your own.
Your statement here and your reaction to how the community has responded is ignorant. Not sure how self-awareness is dehumanizing unless you fancy yourself a champion for corporations, but okay.
Man I hate coming in and whatabouting, but you do realize that the entire EV industry is committing some major atrocities against the people of the Congo, right? Western corporations are ruthlessly vile down there. Everything you named and more.
How about we clean up our own yard before telling on the neighbor?
I see where you’re coming from but you aren’t really speaking for the majority on Lemmy. We are more open to open source projects and Linux around here.
Unfortunately, I also have to use Windows for some things, but Microsoft and Windows 11 are hot garbage, just like your attitude.
It is not even close to a good enough reason. First of all, I don’t really give a shit about what other people do or don’t do on their computers. It is not my responsibility. Second, sneaking in their cloud solution isn’t the right move ever.
Let the user decide if they want it, enable it by default, I don’t care, but don’t sneak it in like it’s a fuckin trojan lol
People on here love this kind of propaganda dude lol
Agreed on pretty much all counts but I love audiobooks just as much as “analog” books lol
Well unlike your ass, I appreciate the nuance of a good performance. But I know what you mean.
GPU time, while cheaper than a voice actor, is still a bit spendy though. And then you also have the various copyright/licensing “issues” associated with AI content, so companies may be a bit hesitant to go all in on producing books like that. Makes more sense for someone like Amazon/Audible and less sense for someone like Spotify.
Besides, most audiobooks exist already, so that really only applies to newer titles.
For sure. I’ve got hundreds of titles in my Audible catalog and most of them are over that threshold. It is a stupid system. I don’t use podcasts or audiobooks on Spotify, I think they should stick to music lol
I should have used the words “want to” instead of “be able to”. It is a garbage company; I am definitely not defending their business practices.
I’m not defending them, just saying that it’s foolish for an enduser to expect anything different when they already don’t pay musicians and that is the primary content on their platform.
I can pretty much guarantee the average user would complain way more about the quality of simple TTS than they would the time limit. It would likely be a much bigger PR issue for them. AI generated TTS would probably be good enough for most but that is just another cost.
Regardless, the licensing involved with book publishers wouldn’t allow them to just produce their own audiobooks like that. So it is not really as simple as “just a choice”.
Audiobooks are expensive to produce and have extra licensing associated with them. Even Amazon can only give out 1 credit for $15 a month. A single book costs anywhere between $10 and $60. It’s just unreasonable to expect Spotify to be able to afford that when they already barely pay musicians.
Audiobooks are expensive to produce.
Spotify is awful when it comes to content creators, but complaining as an end user is crazy.
Nice, that’s awesome!
There are already cases of people pretending to be AI and people revealing dumb info about themselves lol
AI bros are just NFT bros with an actual product.
If you just want to use a local LLM, using something like gpt4all is probably the easiest. Oobabooga or llama.cpp for a more advanced route.
I use ollama with llama3 on my MacBook with open-webui and it works real nice. Mistral7b is another one I like. On my PC I have been using oobabooga with models I get from Hugging Face, and I use it as an API for hobby projects.
I have never trained models, I don’t have the VRAM. My GPU is pretty old so I just use these for random gamedev and webdev projects and for messing around with RP in SillyTavern.
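For anyone curious what “using it as an API for hobby projects” looks like in practice, here is a minimal sketch using Python’s standard library, assuming an Ollama server is running locally on its default port (11434) with the llama3 model pulled; the endpoint and JSON shape follow Ollama’s non-streaming `/api/generate` API.

```python
# Minimal sketch: prompting a local Ollama server from Python.
# Assumes Ollama is running on localhost:11434 with llama3 pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # Non-streaming request body for the /api/generate endpoint
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    # POST the prompt and return the model's full response text
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example usage (needs a running Ollama instance):
# print(generate("llama3", "Say hello in one word."))
```

The same pattern works against oobabooga’s API mode, just with a different URL and request body.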
My bad, I didn’t think pointing out someone’s bad attitude was crossing the line.