
Elon's biggest problem @ Twitter - he's not funny at tweeting

Discussion in 'BBS Hangout: Debate & Discussion' started by SamFisher, Dec 2, 2022.

  1. rocketsjudoka

    rocketsjudoka Member

    Joined:
    Jul 24, 2007
    Messages:
    58,167
    Likes Received:
    48,333
  2. adoo

    adoo Member

    Joined:
    Mar 1, 2003
    Messages:
    11,788
    Likes Received:
    7,924
    this is from Bloomberg

    After a Delaware judge struck down Elon Musk’s $55 billion Tesla pay package, questions still loom about what it means for the company. The 2018 package was meant to “motivate and incentivize” an already super-rich Musk at a time when some wondered whether he was devoting enough time to Tesla. Others speculated back then whether he might depart the company or hire a CEO to replace him. This new drama is yet another overhang for Tesla’s already-slumping share price and complicates a separate effort by Musk to secure another huge pay package.
     
    ROCKSS likes this.
  3. Space Ghost

    Space Ghost Member

    Joined:
    Feb 14, 1999
    Messages:
    18,099
    Likes Received:
    8,539
  4. KingCheetah

    KingCheetah Atomic Playboy
    Supporting Member

    Joined:
    Jun 3, 2002
    Messages:
    59,079
    Likes Received:
    52,746
  5. Amiga

    Amiga Member

    Joined:
    Sep 18, 2008
    Messages:
    25,040
    Likes Received:
    23,300
    We aren't talking about the internet, but social media. Parents didn't know. I didn't know. Society didn't know. We only learned relatively recently. You are pretty dismissive of the harms of social media, so I wonder if you know.

    Screen addiction (to TV) has been a concern for decades. However, one-way mass media such as TV is not nearly as harmful as unregulated social media. The algorithms of social media were designed to directly leverage your reward system, encouraging you to go back for more and more. One-way mass media lacks that capability, and you can't be directly attacked through that medium, unlike social media.
     
  6. Space Ghost

    Space Ghost Member

    Joined:
    Feb 14, 1999
    Messages:
    18,099
    Likes Received:
    8,539
    Yeah, I'm not going to let you run away from this by claiming ignorance. Just because YOU didn't know doesn't mean it hasn't been widely discussed. The fact is, shitty parenting wants to claim 'we didn't know,' just like all the morons who smoked cigarettes for years claimed they didn't know it was harmful. At this point, perhaps these types of people are just ignorant people who need authorities to tell them what to think. If you know how Zuckerberg got started, as an 'intelligent' human, you should know how Zuckerberg's mind works. He is complete trash as a human being and nothing he does is altruistic. It's all by design. Meta is an addiction, and its algorithms are specifically designed to keep the person engaged as much as possible, regardless of outcome. Again, this is not some recent secret that has been exposed; it's just willfully ignorant people. MSM has been doing it for decades, just at a fraction of what Meta has been able to accomplish.

    Intentionally targeting children by social media apps needs to be stopped. That is different from trying to force social media apps to create a 'safe space' with a government regulatory body determining what content is safe and what isn't. Unfortunately, regulating (legal) content is not a solution.

    I am not talking about TVs. I am talking about the cheap two-way devices that everyone has on their bodies at all times.
     
  7. NewRoxFan

    NewRoxFan Member

    Joined:
    Feb 22, 2002
    Messages:
    55,794
    Likes Received:
    55,868
    Good thing elon musk is ramping up efforts...

    https://fortune.com/2024/01/27/elon-musk-x-100-person-content-moderation-office/

    Maybe he can hire back the content moderation team he fired in November...
    https://www.cbsnews.com/news/elon-musk-twitter-layoffs-outsourced-content-moderators/

    And he can continue to follow Zuckerberg's lead...
    https://www.nytimes.com/2022/10/28/technology/twitter-elon-musk-content-moderation.html
     
  8. Amiga

    Amiga Member

    Joined:
    Sep 18, 2008
    Messages:
    25,040
    Likes Received:
    23,300
    Maybe you did know, but I guarantee most parents didn't. Most parents do not understand their own reward systems and how they can be hijacked by smart algorithms. Most weren't aware that social media was using that technique. Even those who are aware do not know how severe the impact could be.

    From the perspective of studies, we didn't 'know' until around 2020 when a number of studies were released on the harm. We also didn't know that Facebook knew until 2021. The media didn't really pick up on this until last year, with Congressional hearings starting to take place. Keep in mind that Facebook is still challenging these studies with their own 'internal' studies.

    Whatever the case, it is what it is, and the idea that parents can do this on their own is not realistic or practical. As I said, there is a cultural shift already - good luck taking devices away from teens. What is needed are tools and guardrails, and that likely won't happen or happen fast enough without legislating. The social media platforms have too much to lose here to act in the interest of kids.

    Ps. For my part, once again, I didn't know. I knew enough that the unknown warranted caution (I thought of it as a huge social experiment on kids), so I taught my kids to avoid social media and to be careful about them years ago. But that doesn't mean I was aware of the harm. I wasn't – I was simply guessing and erring on the side of caution.
     
    ROCKSS likes this.
  9. KingCheetah

    KingCheetah Atomic Playboy
    Supporting Member

    Joined:
    Jun 3, 2002
    Messages:
    59,079
    Likes Received:
    52,746
    Trust and Safety Center of Understaffed Indifference
     
  10. ROCKSS

    ROCKSS Member
    Supporting Member

    Joined:
    May 9, 1999
    Messages:
    7,418
    Likes Received:
    7,879
    I don't have kids and I don't do social media, but it seems like there is plenty of blame to go around, and it starts with the parents teaching their kids about the pitfalls of social media. I do find it disingenuous to see Congress belittling the CEOs as they did; they have culpability in this also. What has Congress done to regulate these platforms? The CEOs are in this to make money for their shareholders. I am sure they all hate what's happened, but what safeguards are they putting on their own platforms? Hell, you have AI now and other countries sneaking onto the platforms to do bad things. I think this all starts with the parents. Easier said than done, I know; I can guarantee you that if we had this when I was in HS, I would not have listened to my folks, just like I didn't listen when they said "drugs are bad mmmmkayyyy."

    The genie is out of the bottle, and there need to be some serious regulations by Congress and the platforms themselves. Hell, team up with Congress and work together to solve the issue. Instead, it seems they both just want to sit back and have these useless debates while nothing changes; it's just political rhetoric where Congress gets to act tough and ask these billionaires to change on their own.
     
  11. Buck Turgidson

    Joined:
    Feb 14, 2002
    Messages:
    100,255
    Likes Received:
    102,325
    LOLOLOLLOL
     
    superfob likes this.
  12. Jugdish

    Jugdish Member

    Joined:
    Mar 27, 2006
    Messages:
    9,066
    Likes Received:
    9,567
    I hate to break it to you...
     
  13. ROCKSS

    ROCKSS Member
    Supporting Member

    Joined:
    May 9, 1999
    Messages:
    7,418
    Likes Received:
    7,879
    I was referring to FB, X, Snapchat, all that crap... this and LinkedIn are my guilty pleasures. :D
     
  14. ROCKSS

    ROCKSS Member
    Supporting Member

    Joined:
    May 9, 1999
    Messages:
    7,418
    Likes Received:
    7,879
    I got a kick out of that too...
     
  15. Space Ghost

    Space Ghost Member

    Joined:
    Feb 14, 1999
    Messages:
    18,099
    Likes Received:
    8,539
    This reminds me of the late 2000s, when mainly Republicans started talking about mass surveillance programs, but it was dismissed as right-wing propaganda. By the time it was exposed (Snowden), the mainstream claimed they were just hearing about it. The same thing happened with the Twitter Files, mass censoring, and the extent of how much the government was involved with social media. That said, I certainly do not trust the social media companies and the government to come up with healthy policies.

    Children under 13 should not be on social media or unsupervised on the internet. That is a parental decision, not a role for government to regulate.

    Algorithms should be open sourced. All social media should be required to clearly mark all promoted content. Social media should also be required to show a history of all ads a user has been exposed to, including the sponsor of the ad, the associated meta tags, and the cost of the ad.
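
    An illustrative sketch of what that per-user ad-exposure record could look like; the field names and function below are hypothetical assumptions for discussion, not any real platform's API:

    ```typescript
    // Hypothetical shape of one entry in a user's ad-exposure history.
    // Field names are illustrative, not taken from any real platform.
    interface AdExposure {
      adId: string;               // platform-assigned identifier for the ad creative
      sponsor: string;            // who paid for the promotion
      metaTags: string[];         // targeting/meta tags associated with the ad
      costUsd: number;            // what the placement cost the sponsor
      shownAt: Date;              // when the ad was shown to the user
      labeledAsPromoted: boolean; // whether it was clearly marked as promoted content
    }

    // Render the history as a plain-text report, the kind of disclosure the post
    // argues every user should be able to pull up.
    function formatAdHistory(history: AdExposure[]): string {
      return history
        .map(
          (ad) =>
            `${ad.shownAt.toISOString()} | ${ad.sponsor} | $${ad.costUsd.toFixed(2)} | ` +
            `tags: ${ad.metaTags.join(", ")} | labeled: ${ad.labeledAsPromoted ? "yes" : "no"}`
        )
        .join("\n");
    }

    // Example with made-up data:
    const history: AdExposure[] = [
      {
        adId: "ad-001",
        sponsor: "Example Sneaker Co.",
        metaTags: ["sports", "age:18-24"],
        costUsd: 0.42,
        shownAt: new Date("2024-02-01T12:00:00Z"),
        labeledAsPromoted: true,
      },
    ];

    console.log(formatAdHistory(history));
    ```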
     
  16. Amiga

    Amiga Member

    Joined:
    Sep 18, 2008
    Messages:
    25,040
    Likes Received:
    23,300
    You're comparing illegal government mass surveillance to legal social media algorithms and user interfaces designed to hook everyone, which are especially impactful on kids (by the way, we still don't know the full ramifications of how they affect the rapidly developing brains of children, which continue to develop throughout childhood and only settle down at around age 25). Additionally, social media is allowed to let hateful and harmful content, directed at everyone and especially impactful on kids, stand on their platforms.

    Expanding on the same comparison:

    Should it solely be the parents' responsibility to restrict their kids' movements and actions to avoid NSA surveillance? Or should it also be illegal, holding the government accountable and making abuse of power less likely? Similarly, should it solely be the parents' responsibility to restrict their kids' actions (like staying off social media) to avoid manipulation from such platforms and potential personal attacks by users? Or should it also be illegal to manipulate kids or allow no restrictions on personal attacks, holding social media accountable and reducing the likelihood of harm to children?

    P.S. Social media isn't all evil. It absolutely has many pros, and we want to keep those while reducing the cons.
     
    #1136 Amiga, Feb 1, 2024
    Last edited: Feb 1, 2024
  17. Space Ghost

    Space Ghost Member

    Joined:
    Feb 14, 1999
    Messages:
    18,099
    Likes Received:
    8,539
    No, I am comparing two examples of willful ignorance. Look, I get it. You need two decades of studies and some institution to tell you that it's bad. And then you expect the same institution to regulate that subject. Many people are like this.
    Meanwhile, back at the ranch, there are whole movements who are fully aware of the destructive nature. No, they don't have billions of dollars of research and complex studies to specifically state why it's bad. But a first clue should be when you take away the device from a child and they go into a raging fit of anger. It doesn't take a great deal of intelligence to see that a person, whether a child or an adult, holding a screen in front of their face for prolonged periods of time is not a good thing. Advertisement manipulation goes back decades. And to think a corporation is going to do the responsible thing is at best naive.

    Bottom line is that companies cannot hire enough babysitters to sit around and monitor the amount of traffic on the internet. You're treating social media as an entitlement. It's absolutely wrong for these platforms to target children; however, that is a different conversation from trying to create a digital safe place. If parents like yourself insist on a digital safe place, then pay a company to create a digital playground where it can be monitored.

    I don't have problems with content or people because I self-censor. Muting does wonders.

    No, it's not evil. However, the redeeming qualities are few and far between.
     
  18. Commodore

    Commodore Member

    Joined:
    Dec 15, 2007
    Messages:
    33,550
    Likes Received:
    17,509
    Notice how the left is using the courts to confiscate the wealth of all their political enemies?

    Want to know why Bezos never says anything and gives the Washington Post a blank check and free rein to print whatever the regime wants? So he doesn't get targeted like Elon.
     
    AroundTheWorld likes this.
  19. Amiga

    Amiga Member

    Joined:
    Sep 18, 2008
    Messages:
    25,040
    Likes Received:
    23,300
    You're essentially advocating for personal and parental control and responsibility, whereas I argue for a dual approach - both parental and social media control and responsibility. I fully agree with your points on parental control. However, where we differ is that I also insist that companies profiting from kids should ensure their safety. It's not an unreasonable expectation. I firmly believe they bear both moral and social responsibilities in this regard. This concept isn't novel; when harm is significant, legislation has often expanded responsibility beyond just parental control. For instance, places selling alcohol are legally mandated not to serve it to minors.

    Addressing your concern about the challenge for companies - allocating a portion of company profit to protect kids seems feasible. With the advancements in AI, moderation and detection of inappropriate content aimed at kids can be handled efficiently by AI at a fraction of the cost of human moderators. The larger issue lies in companies pushing inappropriate content to kids, but they have the capability to cease such practices if legislation mandates it.
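
    A minimal sketch of that kind of automated triage, assuming a hypothetical classifier (the function, thresholds, and term list below are illustrative stand-ins, not any real platform's system):

    ```typescript
    // Hypothetical moderation gate: score content aimed at minors and either
    // block it, queue it for human review, or allow it. The classifier here is a
    // toy stand-in; a real system would call a trained model instead.

    type ModerationDecision = "allow" | "human_review" | "block";

    interface ContentItem {
      id: string;
      text: string;
      audienceIncludesMinors: boolean;
    }

    // Toy risk score in [0, 1] based on a tiny term list (illustrative only).
    function classifyRiskForMinors(item: ContentItem): number {
      const flaggedTerms = ["self-harm", "gambling", "vape"];
      const hits = flaggedTerms.filter((t) => item.text.toLowerCase().includes(t));
      return Math.min(1, hits.length / flaggedTerms.length);
    }

    function moderate(item: ContentItem): ModerationDecision {
      if (!item.audienceIncludesMinors) return "allow";
      const risk = classifyRiskForMinors(item);
      if (risk >= 0.66) return "block";        // clearly inappropriate for kids
      if (risk >= 0.33) return "human_review"; // uncertain: escalate to a person
      return "allow";
    }

    // Example: a post containing one flagged term gets routed to human review.
    console.log(moderate({ id: "1", text: "New vape flavors out now", audienceIncludesMinors: true }));
    ```

    The point of the triage pattern is that only the uncertain middle band needs a human, which is where the claimed cost savings would come from.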

    Regarding your other points - I don't see it as willful ignorance, as that implies knowing and intentionally ignoring. While we had some clues, we weren't aware of the secret NSA mass surveillance or the use of advanced psychological techniques by FB and other social media to 'hook' people.

    Concerning my point about the decades of studies - it's about the risk from the unknown. We can either choose to overlook it or adopt a safer approach, which aligns with my advocacy for both parental responsibility and social media company responsibility.
     
  20. Space Ghost

    Space Ghost Member

    Joined:
    Feb 14, 1999
    Messages:
    18,099
    Likes Received:
    8,539
    A pragmatic approach is for a social media site, or all sites for that matter, to label itself according to its content moderation policies. For example, if a social media site wants to be labeled 'all ages', then let the government regulate it however it sees fit. However, if a site is labeled as Adult Content, the site should be allowed to do what it wants, provided it stays within the law.

    Ultimately, across-the-board regulation leads to abuse. We clearly know that anything can be labeled misinformation. I am firmly against this new trend of removing responsibility from the adult and placing it on society/government.

    Companies abusing children is a different discussion from the government regulating society.

    What we need is more transparency from these social media sites. This is why I stated that algorithms should be open sourced and advertising metadata should be made available to the user. If that happened, a lot of this trash would be fixed via the backlash.
     
