I’ve never known cable providers to fail at broadcasting live TV. The M*A*S*H finale (not live), among many other shows, drew 70–100+ million viewers, and some broadcasts had 80%+ of the entire nation watching the same network without issue. I’ve never seen buffering during a Super Bowl.

Why do streaming services struggle when too many people watch at the same time, while cable television never seemed to? What’s the technical difficulty for a network that has improved enormously over time yet can’t match live-TV audience numbers from decades ago?

I hate ad-based cable television, but I never had issues with it growing up. Why can’t current ‘tech’ meet the same needs we seemed to have solved long ago?

Just curious about what changed in data transmission that made it more difficult for the majority of people to watch the same thing at the same time.
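A rough sketch of the arithmetic behind the question: a broadcast signal costs the same no matter how many people tune in, while a per-viewer (unicast) stream grows linearly with the audience. The 5 Mbps bitrate below is an illustrative assumption, not a measurement:

```python
# Back-of-envelope comparison: broadcast vs. per-viewer (unicast) streaming.
# Cable/over-the-air TV sends one signal no matter how many people tune in;
# internet streaming sends a separate copy of the stream to every viewer.

STREAM_MBPS = 5  # assumed bitrate of a single HD stream (illustrative)

for viewers in (1_000, 1_000_000, 100_000_000):
    broadcast_total = STREAM_MBPS             # one shared signal
    unicast_total = viewers * STREAM_MBPS     # one stream per viewer
    print(f"{viewers:>11,} viewers | broadcast: {broadcast_total} Mbps"
          f" | unicast: {unicast_total / 1e3:,.0f} Gbps")
```

At 100 million viewers the unicast total is on the order of hundreds of terabits per second, which is why streaming platforms lean on CDNs rather than a single pipe.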

  • DontNoodles@discuss.tchncs.de · 4 hours ago

    I’ve wondered for a long time whether it is possible to use WiFi as broadcast/multicast. I understand that it won’t work out of the box, but if one were to write the stack from the ground up (i.e. a different protocol instead of TCP), could a video lecture be transmitted from one WiFi router and viewed on multiple mobile phones without each of them individually connecting to the network? Kind of like how everyone is able to see the SSID of a WiFi node without being connected (see the sketch below).

    Or is it a hardware-level problem that I can’t wrap my head around? I have wanted to understand this for a long time, but I don’t have a background in this subject and don’t know the right questions to ask. Even the LLM-based search tools have not been of much help.
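    For what it’s worth, below TCP the one-to-many pattern already exists on ordinary IP networks: UDP multicast, where a sender transmits once to a group address and every receiver that has joined the group gets the packets. A minimal sketch in Python (the group address 224.1.1.1 and port 5007 are arbitrary choices; how well a given WiFi access point actually forwards multicast frames varies a lot by hardware):

    ```python
    import socket
    import struct

    MCAST_GRP = "224.1.1.1"   # arbitrary multicast group address (assumption)
    MCAST_PORT = 5007         # arbitrary port (assumption)

    def send(message: bytes) -> None:
        # Sender: one sendto() call reaches every receiver in the group.
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
        # TTL 1 keeps the packets on the local network segment.
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
        sock.sendto(message, (MCAST_GRP, MCAST_PORT))

    def receive() -> bytes:
        # Receiver: bind the port, then ask the kernel to join the group.
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        sock.bind(("", MCAST_PORT))
        mreq = struct.pack("4sl", socket.inet_aton(MCAST_GRP), socket.INADDR_ANY)
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
        data, _addr = sock.recvfrom(4096)
        return data

    if __name__ == "__main__":
        send(b"one transmission, many receivers")
    ```

    Note this still assumes the phones have joined the network; receiving data frames without associating at all is a layer-2/firmware question rather than something a normal socket API exposes.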