• 1 Post
• 214 Comments
Joined 10 months ago
Cake day: December 13th, 2024




  • Here we go again problematizing the noun “female”. Some don’t mind. Seriously. And it’s used self-referentially “in-group”: it shows up in feminist book titles, in dating communities (eg, “F4F/M”), classifieds (eg, “need a roommate […] females only”), etc. In conventional language, it’s an acceptable word.

    The problem isn’t so much the word, but its usage, ie, the message. These superficial word criticisms fail to meaningfully engage the fuller context & meaning.

    Imagine making the name for an entire class a derogatory word while the name for the other major gender/sex class remains innocuous. That's classic stigmatization: who does it serve? Opposition to the noun "female" may be unwittingly subscribing to the stigmatization & sexist thinking of exactly those who'd welcome it.



  • Seems you don’t care about grandmas & gen z.

    forcibly doing this to every goddamn phone, phone manufacturer, and Android enthusiast

    They can manage.

    whenever Google decides that unregulated social media services like Lemmy are not family-safe I won’t have to listen to your malicious horseshit

    So casual users can get wrecked, yet I'm malicious? Maybe think of users other than yourself, weigh the potential losses to them from successful attacks, and consider whether OS designers have a legitimate claim in preventing exposure of known threats to casual users while still allowing power users to bypass those checks.

    You’re assuming I use an Android app (trash) to get on here, and not a proper workstation or web browser. You’re welcome to this “malicious horseshit” for eternity.


  • What percentage of them do you think has the capacity and capability to use ADB?

    All of them: they can follow procedures, plug a cable, and push buttons if they really want to. Most won’t bother: capacity isn’t willpower.

    it’s a pain in the arse

    That’s the idea: welcome to an effective deterrent.

    even I’m not going to do it to install a trusted open source app

    Good, then it’ll deter as designed.

    the only reason

    Nah, the use cases are legitimate:

    • It will actually deter installation of malicious software once it’s been identified & flagged that way in their system.
    • It also verifies install packages haven't been tampered with (possibly maliciously) since their original releases.
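    To make the second point concrete, tamper detection can be as simple as comparing a cryptographic digest of the downloaded package against the one published with the release (a minimal Python sketch over hypothetical bytes; real package verification additionally checks a signature over that digest):

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Hex SHA-256 digest of a package's raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Digest the developer published alongside the release (hypothetical bytes).
original = b"official release bytes"
published_digest = sha256_digest(original)

# An unmodified download matches the published digest; a tampered copy doesn't.
tampered = b"official release bytes + injected payload"
print(sha256_digest(original) == published_digest)  # True
print(sha256_digest(tampered) == published_digest)  # False
```

    Any modification to the bytes, however small, changes the digest, so a mismatch reliably flags tampering.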

    Malicious software on devices connected to everything, including highly sensitive information, poses high-cost risks that you & casual users overlook because muh inconvenience 😭. If casual users can't be bothered with a straightforward procedure, as you say, then how prepared are they to handle the real consequences of a successful attack?

    From a security perspective, it makes sense for OS designers to choose to limit exposure to that threat to power users who can be expected to at least have a better idea of what they’re getting themselves into.


  • “They” is a pronoun.

    Not the question.

    It’s loaded language

    Nothing you wrote supports that. In the i-drive case, it draws a distinction between a (1) direct transfer between remote systems (without intermediary) and (2) a transfer between a local & remote system.

    Other OSs have this concept. My first exposure to the concept came from administering Windows systems. Their definition draws an unopinionated distinction between official & unofficial distribution channels

    Sideloading apps is when you install apps that aren’t from an official source, such as the Microsoft Store. Your organization can create its own apps, including line-of-business (LOB) apps.

    & their distinct installation methods with similar caveats

    When you enable sideloading, you allow installing and running apps from outside the Microsoft Store. This action might increase security risks to the device and your data. Sideloaded apps need to be signed with a certificate that the device trusts.

    That’s the entire point of the term there: to express that the installation method & checks differ.

    What about the clear use case for a FOSS developer who doesn’t want to go through the Google authority for validation?

    Sign it yourself or bypass verification as stated before.

    Show me the “clear documentation that power users can still install any package they want,”

    It was linked above: try reading.

    Google already has a service to protect against malicious applications

    which is reactive & doesn’t deter the installation of malicious apps via sideload like the new feature will.


  • Name-calling an opposing “they”

    “They” is now a dirty word?

    "Sideload" wasn't loaded language before Android and still isn't: the claim is bogus & overreactive.

    All of them are valid install methods. Developers will always need a way to load their experimental apps not yet suitable for release: they won’t block the methods they need to do that.

    Clear use cases for casual users exist for

    • deterring them from installing software from bad actors once it's known to be malicious
    • verifying legitimate software hasn't been modified (possibly maliciously) before installing it.

    “They” are drama-queens, because despite legitimate use cases to address actual problems posing high-cost risks to users (even as Google turns out to be a shitty authority) & clear documentation that power users can still install any package they want, they choose to catastrophize.


  • It’s like you guys think forgery is new & provenance/chain of custody doesn’t exist.

    rant about checking sources

    People here who uncritically respond to screenshots lacking links to sources are part of the problem. They discourage linking to disfavored platforms as if that serves some moral cause, when in fact it makes the problem worse: it trains people not to verify & authenticate information by obstructing the most efficient check. So of course those who don't check sources think we're doomed, when checking is incredibly simple to do.

    The importance of checking & citing references isn’t new & doesn’t change with technology. Establishing information integrity has always been necessary to argue truth.







  • I don’t think you should comment on security if “open source” means anything to you

    Anyone can look at the source, brah, and security auditors do.

    For finding backdoors binary disassembly is almost as easy or hard as looking in that “open source”.

    Are you in the dark ages? Beyond code review, there are all kinds of automations to catch vulnerabilities early in the development process, and static code analysis is one of the most powerful.

    Analysts review the design & code and subject it to various security analyzers, including tools that inspect source code, analyze dependencies, check data flow, and test dynamically at runtime.
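    As a toy illustration of what static source analysis does (a hypothetical example, not any particular tool), a few lines of Python can walk a program's syntax tree and flag risky calls without ever running the code:

```python
import ast

# Source under review: contains a risky eval() of user input.
source = """
data = input()
result = eval(data)
"""

def find_risky_calls(code, banned=("eval", "exec")):
    """Statically flag calls to banned functions; the code is never executed."""
    findings = []
    for node in ast.walk(ast.parse(code)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id in banned):
            findings.append((node.lineno, node.func.id))
    return findings

print(find_risky_calls(source))  # [(3, 'eval')]
```

    Production analyzers apply the same idea with far richer rule sets, data-flow tracking, and dependency scanning.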

    There are implementations of some mechanisms from Signal.

    Right, the protocol.

    Can you confidently describe

    Stop right there: I don’t need to. It’s wide open for review by anyone in the public including independent security analysts who’ve reviewed the system & published their findings. That suffices.

    Do security researches have to say anything on DARPA that funds many of them?

    They don’t. Again, anyone in the public including free agents can & do participate. The scholarly materials & training on this aren’t exactly secret.

    Information security analysts aren't exceptional people, and analyzing that sort of system would be fairly routine work for them.

    Oh, the surveillance state will be fine in any case!

    Even with state-level resources, it's well understood that the mathematical problems underpinning modern cryptography are computationally beyond the reach of current hardware to solve in any reasonable amount of time. And that cryptography is straightforward for any competent programmer to implement.
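    A back-of-the-envelope sketch (with assumed, deliberately generous numbers) shows why brute force is hopeless even for a state actor:

```python
# Exhausting a 128-bit keyspace at a trillion guesses per second,
# a generous assumption even for state-level hardware.
keys = 2 ** 128
guesses_per_second = 10 ** 12
seconds_per_year = 60 * 60 * 24 * 365
years = keys // (guesses_per_second * seconds_per_year)
print(years)  # on the order of 10**19 years
```

    That's billions of times the age of the universe, which is why attacks target endpoints and implementations rather than the math.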

    Legally mandating backdoors only reserves true information security for criminals while compromising the security of everyone else.

    I do agree, though: the surveillance state has so many resources to surveil that it doesn’t need another one.


  • You misidentified your objection. It isn’t sideloading removal, which isn’t happening. It’s developer verification, which affects the sideloading that remains available.

    Just because you don’t understand the value of verifying signatures doesn’t mean it lacks value.

    I recall the same alarm over secureboot: there, too, we can load our own certificates into secureboot and sign everything ourselves. That locks the system down against boot-time attacks.

    I will never ever ever be able to get friends and family access to third-party applications after this change.

    Then sign it: problem solved.

    Developer verification should also make it hard enough for them to install trash that fucks up their system and steals their information when that trash is unsigned, or signed & suspended.

    Even so, it's mentioned only in regard to devices that are certified for and ship with Play Protect, which I'm pretty sure can be disabled.

    Google promised they would allow on-device sideloading

    Promise kept.

    their word means fuck-all and you know that

    No, I don’t. Developers are always going to need some way to load their unfinished work.


  • Google will soon stop you sideloading unverified apps

    unverified

    ie, unsigned, so they are not

    fighting tooth & nail to remove side loading too

    Sideloading is still available: you can sign it yourself or bypass verification with adb as they documented.

    Will Android Debug Bridge (ADB) install work without registration? As a developer, you are free to install apps without verification with ADB.

    If I want to modify or hack some apk and install it on my own device, do I have to verify? Apps installed using ADB won’t require verification.

    So, cool misinformation.



  • I don’t think you understand anything you wrote about. Signal is open source, is publicly audited by security researchers, and publishes its protocol, which has multiple implementations in other applications. Messages are encrypted end-to-end, so the only weaknesses are the endpoints: the sender or recipients.

    Security researchers generally agree that backdoors introduce vulnerabilities that render security protocols unsound. Beyond creating opportunities for cybercriminals to exploit, they serve only to amplify the surveillance state's power to invade individuals' privacy.
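    The endpoint point can be illustrated with a toy sketch (an XOR one-time pad, NOT the Signal protocol and not production cryptography): the relaying server only ever sees ciphertext, and only the endpoints holding the shared key can recover the message.

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy one-time pad: XOR each byte with the key. Illustration only."""
    return bytes(k ^ b for k, b in zip(key, data))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # shared only by the two endpoints

ciphertext = xor_cipher(key, message)    # all the relay server ever sees
recovered = xor_cipher(key, ciphertext)  # XOR is its own inverse
print(recovered)  # b'meet at noon'
```

    Without the key, the ciphertext tells the server nothing; real protocols like Signal's add key agreement, authentication, and forward secrecy on top of this basic end-to-end idea.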