🔗 David Sommerseth

F/OSS hacker, mostly working on #OpenVPN
- speaks only for himself.
ex-Twitter account (now inaccessible): https://twitter.com/DavidSommerseth

“Don’t aim to be someone. DO something.”

#nobridge - because I believe in the real #fediverse, and I don’t want my own views/data to be abused by yet another “closed-service which can do whatever it wants for profit”.

If you want to follow me, your profile will now need some content showing we have common ground on interests. I will no longer accept follow requests from random profiles with no toots, or with few follows or followers in the same interest sphere.

  • 3 Posts
  • 42 Comments
Joined 2 years ago
Cake day: December 28th, 2022

  • @abobla

    I kinda struggle to believe it’s that difficult. I mean, Tresorit has a pretty good and functional Linux client. What have they done which makes it sustainable for them?

    Filen.io also has a pure sync-client, which is distributed as an AppImage. This also works, but the FUSE integration Tresorit provides is quite awesome and performing quite decently.

    I would actually recommend Proton to start the development on an older Linux distro, like RHEL/Alma/Rocky 9 or Debian 11 (which is EOL, though), and make it run there. Moving from that distro to newer distros will then go smoother, and you’ll get other distros supported quicker.

    The mistake too many Linux efforts make is to target the “latest and greatest” distro version - often whatever a single Linux developer considers the “most used” distro - and then hit lots of challenges when they need to support older distros. That’s going to be painful.

    @protonprivacy Please take note and forward to Andy and other managers.




  • @Dark_Arc @bl4kers

    I can understand the confusion. But it kinda makes sense… if my hypothesis is correct.

    Proton Drive has the concepts of “My Files” and “Computers”. Files stored under “Computers” (where you can have synced files for up to 10 computers, according to the docs) are tracked for each computer individually.

    So when you uninstall Drive and delete the local files, they only exist in the cloud. But after reinstalling, the client sees that the files for that computer are gone locally … so they get removed in the cloud as well.

    Had these files been moved to “My Files” before the reinstall, this should not have happened.

    At least, that’s my theory.
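The hypothesis above can be sketched in a few lines of Python. This is only my guess at the behaviour, not Proton’s actual implementation: the function name and the reconciliation rule are assumptions for illustration.

```python
# Hypothesised per-computer sync rule: the cloud copy of a "Computers"
# folder mirrors the local state of that machine, so a local deletion
# is propagated to the cloud. (Illustrative sketch, not Proton's code.)

def reconcile_computer_folder(local_files: set[str], cloud_files: set[str]) -> set[str]:
    """Return the cloud state after sync: only files still present
    locally survive in the per-computer cloud folder."""
    return cloud_files & local_files

cloud = {"thesis.pdf", "photos.zip"}

# After uninstall + local delete, the client sees an empty local folder
# and (under this hypothesis) removes everything from the cloud too.
print(reconcile_computer_folder(set(), cloud))  # set() -> files gone

# Files moved to "My Files" first would sit outside this per-computer
# tracking and survive the reinstall.
```

The key point of the sketch: under this model, deletion is not a bug but the intended mirror semantics of the “Computers” concept.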






  • @testeronious

    So I spent a little bit of time digging up what Notion is.
    This is what I found when searching for it … https://www.notion.so/about

    And I honestly have no idea why Skiff would be interesting for Notion. From what I can tell, the only Skiff features overlapping with Notion are Skiff Pages and perhaps Skiff Calendar. The fit is so odd I struggle to fully grasp this.

    First of all, Notion is not a service talking about privacy at all, afaict. And that was one of the main arguments Skiff had.

    And then the first thing this merger announcement states is that the Skiff services are closing down.

    I hate to say this, but Skiff founders couldn’t really have cared that much about privacy then, when they chose to close down so quickly and abruptly like that, without a continuation plan on bringing privacy to Notion.

    I believe the Skiff founders, if they really cared strongly about privacy, realised their service was not sustainable in the longer run, with too high running costs and too low income. In addition, they might have seen that they would need to invest a lot more into further development and that it was too hard to improve their revenue stream. So the alternative was either to go down with a bang (bankruptcy), or to sell “something” to another company and make it sound nicer.

    Right now I just wonder what Skiff managed to actually sell to Notion. Most likely manpower, if I should guess.


  • @Rookwood @testeronious

    Tuta seems to be driven by idealists and privacy activists as well. AFAIK, they also don’t have venture capital and their user base of paying users is what keeps them alive. Which is also why it’s still a small company.

    I don’t recall how Tuta got their initial funding to get started. I don’t think they were crowdfunded in the same way Proton was.

    But the idealistic goals of both Tuta and Proton are what generally make it less likely they will sell out.

    AFAIR, Skiff was VC funded. The idealism of the founders is easily ignored when the VC backing wants to cash in on their investments. And that’s what happened here, in some way or another.








  • @unruhe @protonprivacy

    I thought a bit more on these complaints since this post. And I realised these complaints can also be put into perspective by applying some basic mathematics and common sense.

    Proton has more than 100 million users by now. So let’s say 100 million in this example. How many public complaints from these users would it take to really “catch fire”? Meaning - how often do you read about complaints, and from how many users? More than 100,000 users? Okay. Let’s say there are 1 million dissatisfied users.

    If half of that million users complained loudly on the Internet, I would say that would probably be quite noticeable. Media would most likely pick it up, and it would brew up into a media storm, right?

    Have you noticed anything like that? Do you see that many users complaining?

    And if yes, that would still only represent 0.5% of the whole user base of Proton. If you include the other half complaining “silently”, it would represent 1% of the Proton users.

    That still leaves 99% of users who are at least to some degree satisfied with Proton.

    Even if you pull it up to 20 million dissatisfied users, they would still be in the minority compared to users finding Proton’s services just fine. And 20 million dissatisfied users - that would definitely have caused some media traction, don’t you think?
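The back-of-the-envelope arithmetic above can be written out explicitly. All figures are the assumptions from the post itself, not real Proton statistics:

```python
# Share of a 100M user base represented by the assumed complainers.
total_users = 100_000_000
dissatisfied = 1_000_000           # generous assumption from the post
loud_complainers = dissatisfied // 2

print(f"loud complainers: {loud_complainers / total_users:.1%}")  # 0.5%
print(f"all dissatisfied: {dissatisfied / total_users:.1%}")      # 1.0%
print(f"rest of user base: {1 - dissatisfied / total_users:.0%}") # 99%

# Even the extreme case stays a minority:
extreme = 20_000_000
print(f"extreme case: {extreme / total_users:.0%}")               # 20%
```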


  • @amju_wolf

    They could even have a Fedora Copr repo, where they push out the updated .spec file and get a proper package built for all Fedora, RHEL/CentOS and more distros, with proper RPM packaging and a repository. Push a new build, and all users get an updated package at their next update cycle.

    That would be a reasonable path towards getting the packages into the native yum/dnf repos, and it covers a lot of distributions and releases in a single go.
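To make the Copr idea concrete, here is a minimal, hypothetical RPM .spec skeleton of the kind such a repo would build from. The package name, version, paths and build steps are made-up placeholders for illustration, not Proton’s actual packaging:

```spec
# Hypothetical proton-drive.spec - illustrative skeleton only
Name:           proton-drive
Version:        1.0.0
Release:        1%{?dist}
Summary:        Proton Drive sync client (hypothetical packaging sketch)
License:        GPL-3.0-only
URL:            https://proton.me/drive
Source0:        %{name}-%{version}.tar.gz

%description
Sync client for Proton Drive. Placeholder description.

%prep
%autosetup

%build
%make_build

%install
%make_install

%files
%{_bindir}/proton-drive

%changelog
* Mon Jan 01 2024 Example Packager <packager@example.com> - 1.0.0-1
- Initial package
```

Copr rebuilds a spec like this for every enabled chroot (Fedora releases, EPEL for RHEL/Alma/Rocky), which is what gives the “many distros in a single go” effect.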





  • @Prototype9215 @LunchEnjoyer @LinkOpensChest_wav

    That’s what really happens when @protonmail insists on doing everything on their own, not even doing the continuous development in the open. They provide source code updates only on stable releases, and even those can be delayed by some days after the release.

    That’s not how you build a community of users, developers and package maintainers.

    Had they instead spent resources getting their Linux packages into the native package streams for the most important distros, they would have solved more bugs earlier with help from the community.

    That is probably the most disappointing aspect of Proton. They still don’t grasp how to interact with a broader community, to get real help.

    They would still need to review contributions, just as I expect they do with changes from their own employees. So it wouldn’t reduce the security.

    Also, they can’t really hide behind the code not being ready to be published; the code is published in the end anyway.

    But they really miss the opportunity to get their packages into the standard Linux repositories, which would help resolve the incompatibility issues they now have with certain Linux distributions.

    On top of that, all the needed tooling already exists. It just needs to be implemented correctly in their processes.