• 0 Posts
  • 17 Comments
Joined 11 months ago
Cake day: August 14th, 2023

  • I think this is a picture before the most recent expansion. (They saw this picture and said “hmm not wide enough, too congested.”)

    In the normal parts:

    • 2 express/toll/HOV/carpool lanes
    • 5 regular highway lanes
    • 3 feeder lanes (in Texas, highways tend to have parallel roads, variously called “feeders,” “service roads,” or “frontage roads,” that let people exit and enter, turn onto intersecting roads, and access local businesses; Houston calls them “feeders”).

    That’s 10 in each direction. But at any given point there might be merge lanes between the express and the regular lanes, between the highway and the feeder, or between the feeder and a turn lane. So at the widest point, around the major interchange with another huge toll highway, they add one more of each type of lane, for 13 lanes in each direction.

    There’s also a fair debate about whether the feeder lanes should count. After all, they have traffic lights and intersections to deal with. But on the other hand, driving on them is necessary to get on and off the highway lanes, so in a sense it’s part of the same highway.


  • Yeah, timestamps should always be stored in UTC, but actual planning of anything needs to be conscious of local time zones, including daylight saving time. Describing when a place is open might be simple in local time but clunkier in UTC once you account for daylight saving time, local holidays, etc.
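
    A minimal sketch of that idea in Python, using the standard-library zoneinfo module; the store hours, time zone, and function name here are made up for illustration:

    ```python
    from datetime import datetime, time, timezone
    from zoneinfo import ZoneInfo  # stdlib, Python 3.9+

    # Hypothetical store: hours are defined in local wall-clock time
    # (9am-5pm, America/Chicago), while event timestamps are stored in UTC.
    OPEN, CLOSE = time(9, 0), time(17, 0)
    STORE_TZ = ZoneInfo("America/Chicago")

    def is_open(utc_ts: datetime) -> bool:
        # Convert the stored UTC timestamp to local time; zoneinfo applies
        # the correct daylight saving offset for that date automatically.
        local = utc_ts.astimezone(STORE_TZ)
        return OPEN <= local.time() < CLOSE

    # The same UTC clock time falls on different sides of opening time
    # depending on daylight saving:
    print(is_open(datetime(2023, 7, 1, 14, 0, tzinfo=timezone.utc)))  # True: 9:00 CDT
    print(is_open(datetime(2023, 1, 1, 14, 0, tzinfo=timezone.utc)))  # False: 8:00 CST
    ```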





  • I’d say the real world doesn’t reward being actually gifted.

    More accurately, the real world punishes being below average at any one of like a dozen skill sets. You can’t min/max your stats, because being 99th percentile at something won’t make up for being 30th percentile at something else. Better to be 75th percentile at both.

    The real world requires cross-disciplinary coordination, which means thriving requires both soft skills and multiple hard skills.





  • It basically varies from chip to chip, and program to program.

    Speculative execution is when a program hits some kind of branch (like an if-then statement) and the CPU just goes ahead and calculates as if it’s true, and progresses down that line until it learns “oh wait it was false, just scrub all that work I did so far down this branch.” So it really depends on what that specific chip was doing in that moment, for that specific program.

    It’s a very real performance boost for normal operations, but for cryptographic operations you want every function to perform in exactly the same amount of time, so that something outside that program can’t see how long it took and infer secret information.

    These timing/side-channel attacks generally work like this: imagine a program that tests whether a variable X is prime by checking whether each number from 2 up to X divides it evenly. The bigger X is, the longer that function takes. So if the function runs for a really long time, you’ve got a pretty good idea of what X is. A separate program that isn’t allowed to read the value of X, but can watch another program operate on X, might therefore be able to learn bits of information about X.
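
    Here’s a rough sketch of that trial-division example in Python; the function name and the secret values are made up, but the effect is real: runtime grows with the secret, so anyone who can time the call learns something about X.

    ```python
    import time

    def is_prime_leaky(x: int) -> bool:
        # Trial division: the loop runs roughly x times when x is prime,
        # so the runtime itself reveals information about x.
        for d in range(2, x):
            if x % d == 0:
                return False
        return x > 1

    for secret in (10_007, 100_003, 1_000_003):  # all prime
        start = time.perf_counter()
        is_prime_leaky(secret)
        elapsed = time.perf_counter() - start
        print(f"secret={secret}: {elapsed:.4f}s")  # bigger secret, longer runtime
    ```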

    Patches for these vulnerabilities change the software so those programs/functions run in fixed time, but then you lose the efficiency gains of being able to finish faster; you slow the whole program down to its weakest link, so to speak.
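
    That fixed-time idea is why standard libraries ship constant-time primitives. Python’s hmac.compare_digest, for instance, compares two values in time independent of where they first differ, unlike ==, which bails out at the first mismatched byte:

    ```python
    import hmac

    expected = b"secret-token"

    # == returns at the first differing byte, so timing it can leak how
    # much of a guess is correct; compare_digest takes the same time
    # regardless of where (or whether) the inputs differ.
    print(hmac.compare_digest(expected, b"secret-token"))  # True
    print(hmac.compare_digest(expected, b"secret-tokem"))  # False, same time cost
    ```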


  • This particular class of vulnerabilities, where modern processors try to predict what operations might come next and perform them before they’re actually needed, has been found in basically all modern CPUs/GPUs. Spectre/Meltdown, Downfall, Retbleed, etc. are all hardware vulnerabilities in this class that can leak cryptographic secrets. Patching them generally slows down performance considerably, because the actual hardware vulnerability can’t be fixed directly.

    It’s not even the first one for the Apple M-series chips. PACMAN was a vulnerability in M1 chips.

    Researchers will almost certainly continue to find these, in all major vendors’ CPUs.






  • Notice that your comment is framed from the perspective of what Libertarians believe, and analyzes from that context. Mine is different: it looks at a specific personality type common in tech careers and asks why that type of person tends to be much more receptive to libertarian ideas.

    I’m familiar with libertarianism and its various schools/movements within that broader category. And I still think that many in that group tend to underappreciate issues of public choice, group behaviors, and how they differ from individual choice.

    Coase’s famous paper, “The Nature of the Firm,” tries to bridge some of that tension. But it’s also just not hard to see that human association into groups lies on a spectrum of voluntariness, with many more social situations being coercive than Libertarians tend to appreciate, and Coase’s observations about the efficiencies of association apply to involuntary associations, too.

    At that point you’re having a discussion about public choice theory: what a group owes to defectors, minority views, or free riders within it; what a group owes to those outside it in terms of externalities; and how to build a coalition within that framework of group choice. Your nuanced position might have started as libertarianism, but it ends up looking a lot like mainstream political, social, and economic views, to the point where the libertarian label isn’t that useful.


  • I think technical-minded people tend to gravitate towards libertarian ideologies because they tend to underestimate the importance of human relationships to large-scale systems. You can see it in the stereotype of the lone programmer who dislikes commenting and documentation, avoids collaboration with other programmers, and holds strongly negative views of their own project managers or their company’s executives. They also tend to have a negative view of customers/users, and don’t really believe in spending much time on user interfaces/experiences. They have a natural skepticism of interdependence, because it brings on extra social overhead they don’t particularly believe they need. So they tend to view the legal, political, and social world through that same lens as well.

    I think the modern world of software engineering has moved in a direction away from that, as code complexity has grown to the point where maintainability/documentation and collaborative processes have obvious benefit, visible up front, but you still see streaks of that in some personalities. And, as many age, they have some firsthand experience with projects that were technically brilliant but doomed due to financial/business reasons, or even social/regulatory reasons. The maturation of technical academic disciplines relating to design, user experience, architecture, maintainability, and security puts that “overhead” in a place that’s more immediate, so that they’re more likely to understand that the interdependence is part of the environment they must operate in.

    A lot of these technical-minded people then see the two-party system as a struggle between MBAs and PhDs, neither of whom they actually like, and prefer that problems be addressed organically at the lowest possible level with the simplest, most elegant rules. I have some disagreements with typical libertarians on what weight should be assigned to social consensus, political/economic feasibility, and elegant simplicity in policymaking, but I think I get where most of them are coming from.