• 0 Posts
  • 3 Comments
Joined 2 years ago
Cake day: September 28th, 2023

  • My first instinct would be that it would be equivalent to putting another celestial body with the mass of the earth at whatever distance the earth is from each portal. Since gravity propagates as a wave, it would, in theory, affect a region beyond what would be considered “around” the portals.

    So if you put one portal on the ground, and another 100 meters up, it would be similar to there being a second earth 100 meters from the surface of the earth, experienced by the entire earth (once the gravitational wave propagated). How that would evolve over time is too complex for my basic understanding of physics, but a simulation of it would be a neat experiment.
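    A back-of-the-envelope version of that experiment — a toy Newtonian sketch under my own assumptions, not the commenter's math: treat the mass felt through the overhead portal as a second Earth whose surface sits 100 meters above the ground, i.e. a point mass one Earth radius plus 100 meters away, and ignore how a portal would actually transmit gravity:

    ```python
    # Toy Newtonian point-mass sketch (assumption-laden, not a real portal model).
    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    M_EARTH = 5.972e24   # Earth mass, kg
    R_EARTH = 6.371e6    # Earth radius, m

    def accel_toward(mass_kg: float, distance_m: float) -> float:
        """Newtonian gravitational acceleration toward a point mass."""
        return G * mass_kg / distance_m ** 2

    # Normal downward gravity at the surface, for reference (~9.8 m/s^2):
    g_down = accel_toward(M_EARTH, R_EARTH)

    # Upward pull from the "second earth" whose surface is 100 m overhead:
    g_up = accel_toward(M_EARTH, R_EARTH + 100.0)

    print(f"downward pull: {g_down:.2f} m/s^2")
    print(f"upward pull:   {g_up:.2f} m/s^2")
    print(f"net downward:  {g_down - g_up:+.4f} m/s^2")
    ```

    Under these assumptions the two pulls nearly cancel at the lower portal — the net field is tiny but nonzero — which hints at why the long-term evolution (tides, orbits, everything) gets complicated fast.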


  • I like the sentiment of the article; however, this quote really rubs me the wrong way:

    I’m not suggesting we abandon AI tools—that ship has sailed.

    Why would that ship have sailed? No one is forcing you to use an LLM. If, as the article supposes, using an LLM is detrimental, and it’s possible to start having days where you don’t use an LLM, then what’s stopping you from increasing the frequency of those days until you’re not using an LLM at all?

    I personally don’t interact with any LLMs, neither at work nor at home, and I don’t have any issue getting work done. Yeah, there was a decently long ramp-up period — maybe about 6 months — when I started on my current project at work where it was more learning than doing; but now I feel like I know the codebase well enough to approach any problem I come up against. I’ve even debugged USB driver stuff, and, while it took a lot of research and reading USB specs, I was able to figure it out without any input from an LLM.

    Maybe it’s just because I’ve never bought into the hype, but I don’t see why people hold LLMs in such high regard. I’m of the opinion that an LLM has potential only as a true last resort — and even then it will likely not be useful.