I know storage is fairly cheap, but there are millions of new videos uploaded to YouTube every day, each probably a few hundred MB to a few GB. It all has to take up an enormous amount of space. Not to mention backups.

  • Generator@lemmy.pt · 21 points · 1 year ago

    Not only that, but for each video on YouTube there are different versions for each resolution. So if you upload a 1080p video, it gets converted to 1080p AVC/VP9, 720p AVC/VP9, 480p… and the same goes for the audio.

    If you run youtube-dl -F <youtube url> you will see different formats.
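
    If you’d rather poke at that format list from Python, youtube-dl (or its maintained fork, yt-dlp) exposes the same information through its Python API; here’s a rough sketch, with a placeholder URL:

        # Lists the formats youtube-dl knows about for one video, similar to `-F`.
        # The URL is a placeholder; swap in a real video URL.
        import youtube_dl  # pip install youtube-dl (or use the yt_dlp fork)

        URL = "https://www.youtube.com/watch?v=VIDEO_ID"

        with youtube_dl.YoutubeDL({"quiet": True}) as ydl:
            info = ydl.extract_info(URL, download=False)

        for f in info["formats"]:
            print(f.get("format_id"), f.get("ext"), f.get("format_note"),
                  f.get("vcodec"), f.get("acodec"))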

    • Falmarri@lemmy.world · 7 points · 1 year ago

      Does YouTube actually store copies of each one? Or does it store one master copy and downsample as required in real time? It probably stores them, since storage is cheaper than CPU time.

      • Generator@lemmy.pt · 9 points · 1 year ago

        If it converted every video in real time, that would require a lot of CPU per server; it’s cheaper to store multiple copies. Also, the average video isn’t more than about 300 MB, less if it’s lower quality.

        Anyone with Plex or Jellyfin knows that it’s better to keep the same movie in both qualities (1080p, 720p) than to transcode it, to avoid CPU usage.

        It’s possible to do fast transcoding with GPUs, but with as many users as YouTube has, that would require a lot of power, and with high energy prices, storage is cheaper.
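
        That “transcode once, store every rendition” approach is basically what a Plex/Jellyfin pre-transcode does. A minimal sketch of the idea, assuming ffmpeg is installed and using made-up filenames and generic H.264/AAC settings:

            # Pre-generate several renditions of one upload so playback never
            # needs to transcode. Filenames and quality settings are illustrative.
            import subprocess

            SOURCE = "upload_1080p.mp4"   # hypothetical uploaded master
            RENDITIONS = {                # target height -> output file
                1080: "out_1080p.mp4",
                720: "out_720p.mp4",
                480: "out_480p.mp4",
            }

            for height, out_file in RENDITIONS.items():
                subprocess.run(
                    ["ffmpeg", "-y", "-i", SOURCE,
                     "-vf", f"scale=-2:{height}",   # keep aspect ratio, even width
                     "-c:v", "libx264", "-crf", "23",
                     "-c:a", "aac", "-b:a", "128k",
                     out_file],
                    check=True,
                )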

        • patsharpesmullet@vlemmy.net · -6 points · 1 year ago

          It’s transcoded on the fly; this is a fairly simple Lambda function in AWS, or whatever the GCP equivalent is. You can’t upsample potato spec: the reason it looks like shit is bandwidth, and the service deciding on a lower speed than is actually available.

            • patsharpesmullet@vlemmy.net · 1 point · 1 year ago

              That response is almost 10 years old and completely outdated. I’ve designed and maintained a national media service and can confirm that on-the-fly transcoding is both cheaper and easier. It does make sense to store different formats of videos that are popular at the minute, but in the medium to long term streams are transcoded.

              • mangomission@lemm.ee · 1 point · 1 year ago

                Sure, it’s old, but the stats I posted in a lower comment show that at YouTube’s scale it makes sense to store.

              • mangomission@lemm.ee · 2 points · 1 year ago

                Do you have a source? My instinct is the opposite: compute scales with users, but storage scales with videos.

                • SHITPOSTING_ACCOUNT@feddit.de · 1 point · 1 year ago

                  Consider two cases:

                  • the most recent MrBeast video receiving millions of views from all kinds of devices (some of which require specific formats)
                  • a random video of a cat uploaded 5 years ago, total view count: 3

                  Design a system that optimizes for total cost.
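
                  A quick way to see it is to put made-up numbers on both strategies. Every figure below (prices, sizes, retention) is an assumption for illustration, not a real YouTube or cloud price, but the shape of the result is the point: the 3-view cat video favours transcoding on demand, the MrBeast video favours storing every rendition.

                      # Back-of-envelope break-even: store pre-transcoded renditions
                      # vs. transcode on every view. All numbers are made up.
                      STORAGE_COST_PER_GB_MONTH = 0.02   # $/GB-month
                      TRANSCODE_COST_PER_VIEW = 0.005    # $ of compute per played transcode
                      RENDITIONS_SIZE_GB = 1.0           # total size of the extra renditions
                      RETENTION_MONTHS = 60              # keep renditions for 5 years

                      def cost_store(views: int) -> float:
                          """Transcode once, then pay storage for the renditions."""
                          return STORAGE_COST_PER_GB_MONTH * RENDITIONS_SIZE_GB * RETENTION_MONTHS

                      def cost_on_the_fly(views: int) -> float:
                          """Keep only the master, pay compute every time someone presses play."""
                          return TRANSCODE_COST_PER_VIEW * views

                      for views in (3, 100, 10_000, 1_000_000):
                          winner = "store" if cost_store(views) < cost_on_the_fly(views) else "on-the-fly"
                          print(f"{views:>9} views: store ${cost_store(views):.2f} "
                                f"vs on-the-fly ${cost_on_the_fly(views):.2f} -> {winner}")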

                • NewNewAccount@lemmy.world · 1 point · 1 year ago

                  No source, but I imagine the number of videos must be outpacing the number of users. Users come and go, but every uploaded video stays forever.

                  • mangomission@lemm.ee · 1 point · 1 year ago

                    I think you might be underestimating how many users YouTube has! According to this, 720,000 hours of video are uploaded per day versus 1,000,000,000 hours watched per day!
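
                    Putting those two figures (the same ones quoted above) side by side:

                        # Watch time vs. upload time, using the numbers quoted above.
                        hours_uploaded_per_day = 720_000
                        hours_watched_per_day = 1_000_000_000

                        ratio = hours_watched_per_day / hours_uploaded_per_day
                        print(f"~{ratio:.0f} hours watched for every hour uploaded")  # ~1389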

      • merc@sh.itjust.works · 2 points · 1 year ago

        It probably depends on how popular the video is anticipated to be.

        I remember hearing that something like 80% of uploads to YouTube are never watched. 80% of the remaining 20% are watched only a handful of times. It’s only a tiny fraction that are popular, and the most popular are watched millions of times.

        I’d guess that they don’t transcode the 80% that nobody ever watches. They definitely transcode and cache the popular 4%, but who knows what they do with the 16% in the middle that are watched a few times, but not more than 10x.
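
        The split in that guess works out like this (the 80/20 figures themselves are just the hearsay numbers above, not a sourced stat):

            # Re-deriving the 80% / 16% / 4% split from the guess above.
            never_watched = 0.80
            rarely_watched = (1 - never_watched) * 0.80   # 80% of the remaining 20%
            popular = 1 - never_watched - rarely_watched
            print(f"never: {never_watched:.0%}, a handful of views: {rarely_watched:.0%}, popular: {popular:.0%}")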

      • WhoRoger@lemmy.world · 1 point · 1 year ago

        Doing it in real time would mean more CPU usage every time someone plays it. If converted in advance, they only need to do it once, with the most effective codecs.

    • AnonymousLlama@kbin.social · 1 point · 1 year ago

      I’m keen to know how large these source files are for YouTube compared to the 720/1080 quality ones we see on the front end. I remember them using really impressive compression, but the bitrate was super low to keep the size small.

      If they’re reducing a 10-minute 1080p file from 400 MB down to 40 MB, then that’s a good gain.
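
      Turning those hypothetical sizes into average bitrates (both figures are just the guesses above, not measured values):

          # Average bitrate implied by a given file size and duration.
          def avg_bitrate_mbps(size_mb: float, duration_s: float) -> float:
              """Mbit/s for a file of size_mb megabytes lasting duration_s seconds."""
              return size_mb * 8 / duration_s

          duration = 10 * 60  # a 10-minute video
          for size_mb in (400, 40):
              print(f"{size_mb} MB over {duration}s -> ~{avg_bitrate_mbps(size_mb, duration):.1f} Mbit/s")
          # 400 MB -> ~5.3 Mbit/s ; 40 MB -> ~0.5 Mbit/s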