  • Holy shit. I get it! That’s a great explanation and I really appreciate your taking the time to type it all out. I’m glad we don’t have Lemmy medallions to award but, if we did, I’d give you one. I now see how a 100% reserve requirement, i.e., all deposits completely backed in cash, would entirely change banking.

    The only thing that feels weird to me is that the virtual money the bank creates doesn’t seem to go away once it’s paid back. For example, if a mini bank only had $1000 and lent out $900 with a 10% reserve, they’d end up with $1900 once the loan is repaid (ignoring interest). Or does the $900 they lent create a -$900 on the bank’s books that is cancelled through repayment?
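
    To make that second guess concrete, here’s a toy sketch of how I picture the books working if the loan really does create an offsetting entry (my own guess at the accounting, not something I’ve confirmed):

    ```python
    # Toy books for the hypothetical mini bank above, testing the "-$900" guess:
    # the loan sits on the books as an asset (an IOU owed to the bank), so
    # lending and repayment don't just pile money on top of the original $1000.

    cash = 1000            # Person A's $1000 deposit sits in the vault
    loan_asset = 0         # IOUs the bank holds against borrowers
    deposits_owed = 1000   # what the bank owes Person A

    # Lend $900 in cash (the remaining 10% of the deposit stays behind as reserve):
    cash -= 900
    loan_asset += 900      # the "-$900" in question, booked as an IOU

    assert cash + loan_asset == deposits_owed   # 100 + 900 == 1000

    # Borrower repays the $900 (ignoring interest):
    cash += 900
    loan_asset -= 900      # the IOU is extinguished

    assert cash + loan_asset == deposits_owed   # 1000 + 0 == 1000, not $1900
    print(cash, loan_asset, deposits_owed)      # 1000 0 1000
    ```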


  • I’ve been thinking about it and it still doesn’t make sense. I’m a scientist, not an economist, so it’s wildly out of my wheelhouse. Would you mind pointing me in the right direction?

    Here’s where I’m hung up. Let’s assume a 10% fractional reserve and, for the sake of simplicity, a single bank and a dramatically simplified deposit/loan scenario, to minimize the number of hypothetical people and transactions.

    Person A deposits $1000. The bank lends $900 to Person A, which is sent to Person B.

    Person B deposits $900. The bank lends $810 to Person B, which is sent to Person C.

    Person C deposits $810. The bank lends $729 to Person C, which is sent to Person D.

    Person D deposits $729. The bank lends $656 to Person D, which is sent to Person E.

    Let’s stop there. So we have one initial deposit of $1000, which has resulted in an additional $2,439 in deposits ($3,439 in total) and $3,095 in loans. The bank is now receiving payments, plus interest, on over 3x the amount of actual money it was given. To me, it seems like the bank is figuratively “printing money” and earning interest on it. Nothing I’ve read on fractional reserve lending has suggested this is incorrect.
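
    If it helps to see the arithmetic laid out, here’s a quick script of the chain above, under my simplifying assumption that every loan comes straight back to the same bank as a new deposit:

    ```python
    # Deposit -> loan chain from the example above: one bank, 10% reserve,
    # and every loan is immediately redeposited.

    reserve_ratio = 0.10
    initial_deposit = 1000.0

    deposit = initial_deposit
    total_deposits = 0.0
    total_loans = 0.0

    for step in range(4):              # Persons A through D, as above
        total_deposits += deposit
        loan = deposit * (1 - reserve_ratio)
        total_loans += loan
        deposit = loan                 # the loan becomes the next deposit

    print(round(total_deposits), round(total_loans))   # 3439 and 3095

    # Carried on forever, the geometric series tops out at
    # initial_deposit / reserve_ratio = $10,000 in total deposits and
    # $9,000 in loans, all traceable back to the single $1000 deposit.
    print(initial_deposit / reserve_ratio)
    ```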

    Halp!


  • Here’s my setup. We own a 65" LG G3 and I often game on it. We sit about 4 meters away. I’m a PC gamer and run a 3080. For what it’s worth, I have 20/10 vision (although the trade-off is that I’m a bit farsighted).

    While I can see a difference between 1080p and 4k, it’s pretty minimal for both 3D and 2D, to the point where I have to be looking for it. When I’m actually playing a game, it’s not really noticeable at all. It’s a bit more noticeable up close, but I never have cause to sit just a meter away. This has all been confirmed by my wife and by multiple friends and family members who have checked out our setup before buying a new television, or who have run the same comparison themselves.
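
    As a rough sanity check on why the difference is so subtle at that distance, here’s some napkin math of my own (rough numbers, using the commonly cited ~1 arcminute of resolving power for 20/20 vision):

    ```python
    import math

    # Pixels per degree at our sitting distance, for a 65" 16:9 panel.
    diagonal_in = 65                 # 65" LG G3
    distance_m = 4.0                 # roughly where we sit
    aspect = 16 / 9

    # Screen width from the diagonal, then the angle it spans from the couch.
    width_m = (diagonal_in * 0.0254) * aspect / math.hypot(aspect, 1)
    angle_deg = 2 * math.degrees(math.atan(width_m / 2 / distance_m))

    for name, horizontal_px in [("1080p", 1920), ("4k", 3840)]:
        print(name, round(horizontal_px / angle_deg), "pixels per degree")
    # ~94 px/deg for 1080p, ~188 px/deg for 4k at 4 meters.

    # 20/20 vision resolves roughly 1 arcminute, i.e. about 60 pixels per
    # degree, and 20/10 roughly double that. 1080p already clears the 20/20
    # bar here, which lines up with the difference being subtle.
    ```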

    We even did a fun experiment (I’m a scientist, so I love experiments). We ran the same game at either 4k or 1080p with otherwise identical graphics settings, with the resolution chosen at random by someone else, and tried to guess the resolution from our normal sitting distance while playing as we normally would. Our guesses were no better than chance.

    Lastly, a friend was all for buying an 8k television prior to my 1080p/4k demonstration. They now have the even larger 77" version of my television and, at about 5 meters away, the difference is still pretty minimal.

    So I happily game at 1080p with the settings cranked up.