agnos.is Forums

Do any of you have a buttload of RAM sitting around?

Selfhosted · 40 Posts · 27 Posters
[email protected] (original post):

Hi,

I have a friend who is looking to run a few simulations he has implemented in Python and needs around 256 GB of RAM. He is estimating it will take a couple of hours, but he is studying economics, so take that with a grain of salt 🤣

For this instance I recommended GCP, but I felt a bit dirty doing that. So, I was wondering if any of you have a buttload of memory he can borrow? Generally, would you lend your RAM for a short amount of time to a stranger over the internet? (Assuming internet access is limited to a single SSH port and other necessary safeguards are in place.)
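Before renting hardware at all, one common workaround is worth trying: if the bottleneck is one huge array, memory-map it to disk with NumPy so only the pages being worked on stay resident. A minimal sketch; the sizes, file name, and per-block computation are illustrative only, not from the thread:

```python
import os
import tempfile

import numpy as np

# Illustrative size only: a 4000 x 4000 float64 array (~128 MB on disk).
n = 4_000
path = os.path.join(tempfile.mkdtemp(), "state.dat")

# Back the array with a file; the OS pages data in and out on demand,
# so resident memory stays well below the full array size.
state = np.memmap(path, dtype=np.float64, mode="w+", shape=(n, n))

# Process one row-block at a time instead of touching the whole matrix.
for start in range(0, n, 500):
    block = state[start:start + 500]
    block[:] = np.random.default_rng(start).standard_normal(block.shape)
    block *= 0.5  # stand-in for the real per-block computation
state.flush()

print(os.path.getsize(path))  # 4000 * 4000 * 8 = 128,000,000 bytes
```

Whether this helps depends entirely on the access pattern: sequential block sweeps map well, random access across the whole array will thrash the page cache.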

[email protected] wrote (#28):

i have 8 gigs that's been living on my desk for the last 4 years
    • S [email protected]

      First, define what you are asking for.

      Do you want someone to send you a cardboard box full of RAM? Then forget it. Nobody would be stupid enough to lend that much expensive hardware to someone on the internet.

      Or are you asking for someone to let you run random code on their PC for a few hours? Then forget it. Nobody would be stupid enough to open "a single SSH port" to someone on the internet to run potential malware on their PC.

      That's exactly what cloud platforms are there for, and if you don't like google, get any other cloud provider.

[email protected] wrote (#29):

Seconded. If they can't optimize their code (I have never seen an application require 256 GB of RAM, even in FAANG, so I find that doubtful), then they need to rent a machine. The cloud is where you rent it. If not Google, then AWS, Azure, DigitalOcean, or any number of other places will let you rent compute.
      • C [email protected]

        Needing that much RAM is usually a red flag that the algo is not optimized.

[email protected] wrote (#30):

*looking at the 14 TB cluster I had running for 18 hours*

Yep, nobody would ever need that much memory.
        2
In reply to [email protected]'s original post above:

[email protected] wrote (#31):

Tell your friend to open-source the algorithm. Somebody will surely point out an easy optimization.

100 others will just shit on your friend.
In reply to [email protected] (#29) above:

[email protected] wrote (#32):

Yeah, it's an economics student running something in Python. I can guarantee that it's horribly unoptimized.
            • R [email protected]

              *looking at the 14TB cluster I had running for 18 hours

              Yep, nobody would ever need that much memory

[email protected] wrote (#33):

Wow, yeah, I think you win that contest lol.
              • C [email protected]

                Wow, yea I think you win that contest lol.

[email protected] wrote (#34):

To be honest, it was a very parallel process. I could do a fraction of the compute, needing a fraction of the RAM, but taking a shit-ton more time.

Also, there's no perfect machine for this use. I can have 3.5 times more RAM than needed, or start swapping and waste time.
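The trade-off described above (a fraction of the RAM for a multiple of the time) can be sketched with a toy reduction: compute the same result with the whole input resident, or one chunk at a time. The sizes and function names here are invented for illustration:

```python
import math

import numpy as np

N = 10_000_000

def all_at_once():
    # Whole input resident at once: ~80 MB for N float64 values.
    x = np.arange(N, dtype=np.float64)
    return float((x * x).sum())

def chunked(chunk=1_000_000):
    # Same reduction, but only ~8 MB resident at any moment.
    total = 0.0
    for start in range(0, N, chunk):
        x = np.arange(start, min(start + chunk, N), dtype=np.float64)
        total += float((x * x).sum())
    return total

# Same answer up to float rounding, far smaller peak memory, more loop
# overhead: exactly the "fraction of the RAM, more time" trade.
assert math.isclose(all_at_once(), chunked(), rel_tol=1e-9)
```

Shrinking the chunk shrinks peak memory further, at the cost of more Python-level loop iterations.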
                • C [email protected]

                  Needing that much RAM is usually a red flag that the algo is not optimized.

[email protected] wrote (#35):

Nope. Some algorithms are fastest when the whole data set is held in memory. You could design them to page data in from disk as needed, but it would be slower.

OpenTripPlanner, for example, will hold the entire road network of the US in memory for fast driving directions, and it uses an amount of RAM in that ballpark.
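A tiny sketch of why routing engines keep the whole graph resident: shortest-path queries hop across many scattered nodes, so paging each neighbor list in from disk would dominate query time. The graph and weights below are invented for illustration:

```python
import heapq

graph = {  # adjacency list held entirely in memory
    "A": [("B", 4), ("C", 2)],
    "B": [("D", 5)],
    "C": [("B", 1), ("D", 8)],
    "D": [],
}

def shortest(src, dst):
    # Dijkstra's algorithm: each pop may touch any node's neighbor list,
    # which is why random-access speed to the graph matters so much.
    dist = {src: 0}
    pq = [(0, src)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == dst:
            return d
        if d > dist.get(node, float("inf")):
            continue
        for nxt, w in graph[node]:
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(pq, (nd, nxt))
    return float("inf")

print(shortest("A", "D"))  # 8, via A -> C -> B -> D
```

Scale this four-node toy up to every road segment in the US and the in-memory requirement lands in the hundreds of gigabytes, as the post says.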
                  • M [email protected]

                    Nope. Some algorithms are fastest when a whole data set is held into memory. You could design it to page data in from disk as needed, but it would be slower.

                    OpenTripPlanner as an example will hold the entire road network of the US in memory for example for fast driving directions, and it uses the amount of RAM in that ballpark.

[email protected] wrote (#36):

Sure, that is why I said usually. The fact that two people replied with the same OpenStreetMap data set kinda proves my point.

Also, do you need the entire US road system in memory if you are going somewhere 10 minutes away? Seems inefficient, but I am not an expert here. I guess it is one giant graph; if you slice it up, suddenly there are a bunch of loose ends that break the navigation.
In reply to [email protected]'s original post above:

[email protected] wrote (#37):

AWS has an r4.8xlarge with 244 GB RAM and 32 vCPUs for $2.13 an hour if they can handle Linux, or $2.81 an hour for Windows.
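At the rates quoted above (which vary by region and will have changed since the post), the back-of-envelope cost for a run of a few hours is easy to check:

```python
# Hourly rates as quoted in the post, not current AWS pricing.
linux_rate = 2.13    # $/hour, r4.8xlarge, Linux
windows_rate = 2.81  # $/hour, r4.8xlarge, Windows
hours = 3            # "a couple of hours" plus some margin

linux_cost = linux_rate * hours
windows_cost = windows_rate * hours
print(f"Linux: ${linux_cost:.2f}, Windows: ${windows_cost:.2f}")
# Linux: $6.39, Windows: $8.43
```

Even with a generous margin for reruns, the whole job costs less than lunch, which is the usual argument for renting rather than buying this class of machine.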
In reply to [email protected]'s original post above:

[email protected] wrote (#38):

Apply for compute time on a university cluster. It is free and usually easy.
                        • C [email protected]

                          Sure, that is why I said usually. The fact that 2 people replied with the same OpenStreetMap data set is kinda proving my point.

                          Also, do you need the entire US road system in memory if you are going somewhere 10 minutes away? Seems inefficient, but I am not an expert here. I guess it is one giant graph, if you slice it up, suddenly there are a bunch of loose ends that break the navigation.

[email protected] wrote (#39):

I host routing for customers across the US, so yes, I need it all. There are ways to solve the problem with less memory, but the point is that some problems really do require a huge amount of memory because of data scale and performance requirements.
In reply to [email protected] (#29) above:

[email protected] wrote (#40):

In computational physics, 256 GB of RAM can be small. I have run things that big easily. Some colleagues have needed TiB. If it is heavy matrix diagonalization, it can get chunky if unoptimized.