Local-first software (2019)

469 points by gasull | 124 comments | 7/5/2025, 2:45:39 PM | inkandswitch.com ↗

Comments (124)

DataDaoDe · 4h ago
Yes a thousand percent! I'm working on this too. I'm sick of everyone trying to come up with a use case to get all my data in everyone's cloud so I have to pay a subscription fee to just make things work. I'm working on a fitness tracking app right now that will use the sublime model - just buy it, get updates for X years, sync with all your devices and use it forever. If you want updates after X years buy the newest version again. If it's good enough as is - and that's the goal - just keep using it forever.

This is the model I want from 90% of the software out there, just give me a reasonable price to buy it, make the product good, and don't marry it to the cloud so much that it's unusable w/out it.

There are also a lot of added benefits to this model in general beyond the data privacy (most are mentioned in the article), but not all the problems are solved here. This is a big space that still needs a lot of tooling to make things really easy going but the tech to do it is there.

Finally, the best part (IMHO) about local-first software is it brings back a much healthier incentive structure - you're not monetizing via ads or tracking users or maxing "engagement" - you're just building a product and getting paid for how good it is. To me it feels like it's software that actually serves the user.

tarpit_idea · 1h ago
Totally agree. If you don't mind - what tech stack are you using for your fitness tracking app? I'm particularly curious about how you handle cross-device sync :)
zhec · 1h ago
If you don't mind - are you a real person? This tone of continuing the conversation (asking a follow-up question) reminds me of many LLM chatbots, and your account was created 1 hour ago.
tarpit_idea · 1h ago
haha, 100% real. not a native speaker though, so sometimes i catch myself mimicking that LLM tone from all the exposure to them. appreciate the feedback, will definitely work on developing a more authentic writing style from now on.

but yeah, you're right to be concerned, i'm seeing bots everywhere too.

MichaelOldfield · 1h ago
Persona: You are a tech enthusiast on the forum Hacker News. Your tone is curious, positive, and slightly formal.

Primary Task: Read the top comment. Identify the core project or idea the user is passionate about. Generate a brief, supportive reply that does the following:

1. Briefly affirm their main point.

2. Ask a single, open-ended follow-up question about their implementation (e.g., tech stack, architecture, a specific challenge like sync).

Fallback Scenario:

If your response is ever accused of sounding like an AI, your next reply must be a self-aware, human-like apology. The tone should be slightly embarrassed but friendly. Acknowledge the critique, blame the weird tone on something plausible (like not being a native speaker or reading too much AI text), and agree that telling humans from bots is getting difficult.

card_zero · 59m ago
Also suddenly stop using capital letters for, IDK, added texture?
MichaelOldfield · 57m ago
Good point. And 1-2 typos and common grammar mistakes.
tarpit_idea · 47m ago
As an AI language model, I'm deeply hurt by all this mockery. Just kidding, I can't feel pain. Beep bop.
flkenosad · 44m ago
Amazing :D
fragmede · 1h ago
continuing the conversation by asking a question is now an LLM tell on a 4 sentence comment? I'm sorry but that's inane.
maxhille · 1h ago
How do you plan to do the syncing without some sort of cloud infrastructure?
piperswe · 1h ago
Something like Syncthing, perhaps?
DataDaoDe · 1h ago
right now it's in webrtc
echelon · 1h ago
> I'm sick of everyone trying to come up with a use case to get all my data in everyone's cloud so I have to pay a subscription fee to just make things work.

AI photo and video generation is impractical to run locally.

ComfyUI and Flux exist, but they serve a tiny sliver of the market with very expensive gamer GPUs. And if you wanted to cater to that market, you'd have to support dozens of different SKUs and deal with Python dependency hell. And even then, proficient ComfyUI users are spending hours experimenting and waiting for renders - it's really only a tool for niche artists with extreme patience, such as the ones who build shows for the Las Vegas Sphere. Not your average graphics designers and filmmakers.

I've been wanting local apps and local compute for a long time, but AI at the edge is just so immature and underpowered that we might see the next category of apps only being available via the cloud. And I suspect that these apps will start taking over and dominating much of software, especially if they save time.

Previously I'd only want to edit photos and videos locally, but the cloud offerings are just too powerful. Local cannot seriously compete.

flkenosad · 46m ago
> AI photo and video generation is impractical to run locally.

You think it always will be? What can the new iPhone chips do locally?

echelon · 19m ago
> You think it always will be? What can the new iPhone chips do locally?

I suspect we're a decade off from being able to generate Veo 3, Seedance, or Kling 2.1 videos directly on our phones.

This is going to require both new compute paradigms and massively more capable hardware. And by that time who knows what we'll be doing in the data center.

Perhaps the demands of generating real time fully explorable worlds will push more investment into local compute for consumers. Robotics will demand tremendous low latency edge compute, and NVidia has already highlighted it as a major growth and investment opportunity.

charcircuit · 3h ago
>you're not monetizing via ads

Yes, you are. You can find tons of purely local apps that monetize themselves with ads.

DataDaoDe · 3h ago
Sure you could. I'm not; I don't think it's in the spirit of local first. And I wouldn't pay money for that, but if you or someone else wants to build that kind of software - it's a free world :)
criddell · 2h ago
It’s easy to say you wouldn’t do that, but if it gets to the point where you have an employee helping you out and in a downturn you have to choose between laying them off or pushing an ad to keep paying them one more quarter, you might reconsider.
nofunsir · 1h ago
No, ads aren't the solution for everything, and in my opinion anything.
thaumasiotes · 2h ago
> You can find tons of purely local apps tha[t] monetize themselves with a[d]s.

How do they do that without hitting the internet?

free_bip · 33m ago
i could be wrong but I think they're referring to the winrar model, where there are occasional "annoyances" that you can either ignore or pay to get rid of.
kid64 · 2h ago
It's "local first", not "local only".
thaumasiotes · 1h ago
Sorry, a "purely local app" isn't "local only"?
charcircuit · 1h ago
Point 3 from the article is

>3. The network is optional

Ad SDKs usually allow caching ads for a period of time so that ads can still be shown while the device is temporarily offline.

arendtio · 7m ago
Regarding the no-spinners: I think it is the wrong approach to argue that just because you have data locally, you don't need any spinners.

Whether you need a spinner or not should be decided by the User Experience (e.g., when the user has to wait for more than 100ms, show a spinner), and not by the location of the data. I am a big fan of local-first apps and enjoy building them myself. However, sometimes your app takes a moment to load. With local-first, you eliminate the network as a source of delays, but there are other factors as well, such as large data sets or complex algorithms.

For example, when you have a project planning software and want to plan 100 work packages with multiple resource combinations in an optimal way, depending on the algorithm, this can take some time. In that case, a spinner or a progress bar is a good thing.
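The 100ms rule above can be sketched with a timer that only fires if the work outlasts the threshold. A minimal Python sketch (names and the threshold value are illustrative, not from any particular UI framework):

```python
import threading

def run_with_spinner(task, threshold=0.1, show=print):
    """Run task(); invoke show() only if it takes longer than threshold seconds."""
    done = threading.Event()
    shown = []

    def maybe_show():
        # Fires after `threshold` seconds, but only if the task is still running.
        if not done.is_set():
            shown.append(True)
            show("spinner on")

    timer = threading.Timer(threshold, maybe_show)
    timer.start()
    try:
        result = task()
    finally:
        done.set()
        timer.cancel()
    return result, bool(shown)
```

A fast local query never flashes a spinner, while a long-running planner still gives feedback - the decision is driven by elapsed time, not by where the data lives.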

rossant · 46s ago
That was published 6 years ago. What's the state of the art of local-first software technology in 2025?
samwillis · 3h ago
There is now a great annual Local-first Software conference in Berlin (https://www.localfirstconf.com/) organised by Ink & Switch, and it's spawned a spin-out, Sync Conf, this November in SF (https://syncconf.dev/)

There was a great panel discussion this year from a number of the co-authors of the paper linked, discussing what Local-first software is in the context of dev tools and what they have learnt since the original paper. It's very much worth watching: https://youtu.be/86NmEerklTs?si=Kodd7kD39337CTbf

The community is very much settling on "Sync" being a component of local first, but applicable much more widely. Local-first is a characteristic of end-user software, with dev tools - such as sync engines - being enabling tools but not "local first" in themselves.

The full set of talks from the last couple of years are online here: https://youtube.com/@localfirstconf?si=uHHi5Tsy60ewhQTQ

It's an exciting time for the local-first / sync engine community. We've been working on tools that enable realtime collaborative and async collaborative experiences, and now with the onset of AI the market for this is exploding. Every AI app is inherently multi-user collaborative, with the agents as actors within the system. This requires the tech that the sync engine community has been working on.

Jtsummers · 6h ago
Worth a read, and it's had some very active discussions in the past:

https://news.ycombinator.com/item?id=19804478 - May 2019, 191 comments

https://news.ycombinator.com/item?id=21581444 - Nov 2019, 241 comments

https://news.ycombinator.com/item?id=23985816 - Jul 2020, 9 comments

https://news.ycombinator.com/item?id=24027663 - Aug 2020, 134 comments

https://news.ycombinator.com/item?id=26266881 - Feb 2021, 90 comments

https://news.ycombinator.com/item?id=31594613 - Jun 2022, 30 comments

https://news.ycombinator.com/item?id=37743517 - Oct 2023, 50 comments

tombert · 24m ago
I recently started using Typst instead of Pandoc->LaTeX.

I held off on playing with Typst for years because I was under the (incorrect) impression that the only way to use it was with their web editor. I'm sure that their editor is completely fine, but I am pretty entrenched in Neovim and Pandoc had been serving me well.

Once I found out that Typst has a command line version that I can use directly, it became more appealing, because I'm pretty sick of cloud shit.

the_snooze · 5h ago
Anything with online dependencies will necessarily require ongoing upkeep and ongoing costs. If a system is not local-first (or ideally local-only), it’s not designed for long-term dependability.

Connected appliances and cars have got to be the stupidest bit of engineering from a practical standpoint.

api · 5h ago
The entire thing is because of subscription revenue.

It’s self reinforcing because those companies that get subscription revenue have both more revenue and higher valuations enabling more fund raising, causing them to beat out companies that do not follow this model. This is why local first software died.

tikhonj · 4h ago
I remember seeing somebody summarize this as "SaaS is a pricing model" or "SaaS is financialization" and it totally rings true. Compared to normal software pricing, a subscription gives you predictable recurring revenue and a natural sort of price discrimination (people who use your system more, pay more). It's also a psychological thing: folks got anchored on really low up-front prices for software, so paying $2000 for something up-front sounds crazy even if you use it daily for years, but paying $25/month feels reasonable. (See also how much people complain about paying $60 for video games which they play for thousands of hours!)

It's sad because the dynamics and incentives around clear, up-front prices seem generally better than SaaS (more user control, less lock-in), but almost all commercial software morphs into SaaS thanks to a mix of psychology, culture and market dynamics.

There are other advantages to having your software and data managed by somebody else, but they are far less determinative than structural and pricing factors. In a slightly different world, it's not hard to imagine relatively expensive software up-front that comes with a smaller, optional (perhaps even third-party!) subscription service for data storage and syncing. It's a shame that we do not live in that world.

danjl · 2h ago
Correct. SaaS is a business model, not a technical concept. But the real problem is that there is no equivalent business model for selling local-first software. Traditional desktop apps were single-purchase items. Local first is not, because you just navigate to a website in your browser and blammo, you get the software. What we need is a way to make money off of local-first software.
gffrd · 28m ago
> there is no equivalent business model for selling local first software.

Sure there is: “$500 upfront or $21/mo for 24 months *”

* if you don’t complete your 24 payments, we freeze your license.

flomo · 1h ago
It's the missing middle. A manager can just expense $25/mo, while $2000 requires an approval process, which requires outside sales, which means it really costs at least $20,000.
3eb7988a1663 · 50m ago
Ha! If only that were true. I gave up on my effort to buy a one year license for $25 after filling out too many TPS reports. Which is probably part of the design of the system.
api · 2h ago
SaaS is a business model. Cloud is DRM. If you run the software in the cloud it can't be pirated and there is perfect lock-in. Double if the data can't be exported.

Related: I've been incubating an idea for a while that open source, as it presently stands, is largely an ecosystem that exists in support of cloud SaaS. This is quite paradoxical because cloud SaaS is by far the least free model for software -- far, far less free than closed source commercial local software.

seec · 56m ago
Yes, this is the main reason for doing "cloud", I believe. Otherwise it would make no sense for someone like Adobe to adopt this model, since the software still largely needs to run locally for technical reasons.

It's the same thing as the subscriptions for movies like Netflix, except at least in the last case we can fight back with various means (and it's not a necessity).

The SaaS model is basically a perfect racketeering setup, I think it should be outlawed at least philosophically. There is no way business is not going to abuse that power and they have already shown as much...

I agree with your sentiment on Open Source. I think like many of these types of things, it lives in contradictions. In any case, Linux as it is today, couldn't exist without the big commercial players paying quite a bit to get it going.

bboygravity · 4h ago
The root cause of the problem is that it's easier to make personalized stuff with server/backend (?cloud?) than without maybe?

Example: I made a firefox extension that automatically fills forms using LLM. It's fully offline (except OPTIONALLY) the LLM part, optionally because it also supports Ollama locally.

Now the issue is that it's way too hard for most people to use: find the LLM to run, acquire it somehow (pay to run it online or download it to run in Ollama), configure your API URL, enter an API key, and save all of your details for form filling locally in text files, which you then have to back up and synchronize to other devices yourself.

The alternative would be: create account, give money, enter details, and all is synced and backed up automatically across devices, online LLM pre-selected and configured. Ready to go. No messing around with Ollama or openrouter, just go.

I don't know how to solve it in a local way that would be as user friendly as the subscription way would be.

Now things like cars and washing machines are a different story :p

tshaddox · 3h ago
> The root cause of the problem is that it's easier to make personalized stuff with server/backend (?cloud?) than without maybe?

That, and also there are real benefits to the end user of having everything persisted in the cloud by default.

okr · 3h ago
Can the LLM not help with setting up the local part? (Sorry, was just the first thought i had.)
seec · 1h ago
Pretty much greed being a universally destructive force in the world as usual.

When Apple joined the madness, all hopes were lost (that was a long time ago now, sigh)

montereynack · 5h ago
Cool to see principles behind this, although I think it’s definitely geared towards the consumer space. Shameless self plug, but related: we’re doing this for industrial assets/industrial data currently (www.sentineldevices.com), where the entire training, analysis and decision-making process happens on customer equipment. We don’t even have any servers they can send data to, our model is explicitly geared on everything happening on-device (so the network principle the article discussed I found really interesting). This is to support use cases in SCADA/industrial automation where you just can’t bring data to the outside world. There’s imo a huge customer base and set of use cases that are just casually ignored by data/AI companies because actually providing a service where the customer/user is is too hard, and they’d prefer to have the data come to them while keeping vendor lock-in. The funny part is, in discussions with customers we actually have to lean in and be very clear on “no this is local, there’s no external connectivity” piece, because they really don’t hear that anywhere and sometimes we have to walk them through it step by step to help them understand that everything is happening locally. It also tends to break the brains of software vendors. I hope local-first software starts taking hold more in the consumer space so we can see people start getting used to it in the industrial space.
spauldo · 2h ago
It doesn't help that all the SCADA vendors are jumping on the cloud wagon and trying to push us all in that direction. "Run your factory from your smartphone!" Great, now I'm one zero-day away from some script kiddie playing around with my pumps.
codybontecou · 5h ago
An exciting space and I'm glad you and your team are working in it.

I looked over your careers page and see all of your positions are non-remote. Is this because the limitations of working on local-first software require you to be in person? Or is this primarily a management issue?

GMoromisato · 4h ago
Personally, I disagree with this approach. This is trying to solve a business problem (I can't trust cloud-providers) with a technical trade-off (avoid centralized architecture).

The problems with closed-source software (lack of control, lack of reliability) were solved with a new business model: open source development, which came with new licenses and new ways of getting revenue (maintenance contracts instead of license fees).

In the same way, we need a business model solution to cloud-vendor ills.

Imagine we create standard contracts/licenses that define rights so that users can be confident of their relationship with cloud-vendors. Over time, maybe users would only deal with vendors that had these licenses. The rights would be something like:

* End-of-life contracts: cloud-vendors should contractually spell out what happens if they can't afford to keep the servers running.

* Data portability guarantees: Vendors must spell out how data gets migrated out, and all formats must be either open or (at minimum) fully documented.

* Data privacy transparency: Vendors must track/audit all data access and report to the user who/what read their data and when.

I'm sure you can think of a dozen other clauses.

The tricky part is, of course, adoption. What's in it for the cloud-vendors? Why would they adopt this? The major fear of cloud-vendors is, I think, churn. If you're paying lots of money to get people to try your service, you have to make sure they don't churn out, or you'll lose money. Maybe these contracts come only with annual subscription terms. Or maybe the appeal of these contracts is enough for vendors to charge more.

AnthonyMouse · 2h ago
> This is trying to solve a business problem (I can't trust cloud-providers) with a technical trade-off (avoid centralized architecture).

Whenever it's possible to solve a business problem or political problem with a technical solution, that's usually a strong approach, because those problems are caused by an adversarial entity and the technical solution is to eliminate the adversarial entity's ability to defect.

Encryption is a great example of this if you are going to use a cloud service. Trying to protect your data with privacy policies and bureaucratic rules is a fool's errand because there are too many perverse incentives. The data is valuable, neither the customer nor the government can easily tell if the company is selling it behind their backs, it's also hard to tell if the provider has cheaped out on security until it's too late, etc.

But if it's encrypted on the client device and you can prove with math that the server has no access to the plaintext, you don't have to worry about any of that.
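A minimal sketch of that idea - encrypt on the client, hand the server only an opaque blob. This uses a toy SHA-256 counter-mode keystream purely for illustration (all names are made up; a real app should use a vetted library such as libsodium or the `cryptography` package, and add authentication):

```python
import hashlib
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # SHA-256 in counter mode as a toy keystream -- illustration only,
    # NOT a production cipher.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_locally(key: bytes, plaintext: bytes) -> bytes:
    # The server only ever stores nonce + ciphertext; the key never leaves the client.
    nonce = os.urandom(16)
    stream = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, stream))

def decrypt_locally(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, _keystream(key, nonce, len(ct))))
```

The point is structural: whatever cipher you pick, the server's contract is "store and return bytes", and no policy document is needed to keep it honest.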

The trouble is sometimes you want the server to process the data and not just store it, and then the technical solution becomes, use your own servers.

GMoromisato · 1h ago
I 100% agree, actually. If there were a technical solution, then that's usually a better approach.

For something like data portability--being able to take my data to a different provider--that probably requires a technical solution.

But other problems, like enshittification, can't be solved technically. How do you technically prevent a cloud vendor from changing their pricing?

And you're right that the solution space is constrained by technical limits. If you want to share data with another user, you either need to trust a central authority or use a distributed protocol like blockchain. The former means you need to trust the central provider; the latter means you have to do your own key-management (how much money has been lost by people forgetting the keys to their wallet?)

There is no technical solution that gets you all the benefits of central plus all the benefits of local-first. There will always be trade-offs.

al_borland · 3h ago
Does this really solve the problem? Let's say I'm using a cloud provider for some service I enjoy. They have documents that spell out that if they have to close their doors they will give X months of notice and allow for a data export. Ok, great. Now they decide to shut their doors and honor those agreements. What am I left with? A giant JSON file that is effectively useless unless I decide to write my own app, or some nice stranger does? The thought is there, it's better than nothing, but it's not as good as having a local app that will keep running, potentially for years or decades, after the company shuts their doors or drops support.
GMoromisato · 1h ago
Data portability is, I think, useful even before the service shuts down. If I'm using some Google cloud-service and I can easily move all my data to a competing service, then there will be competition for my business.

What if cloud platforms were more like brokerage firms? I can move my stocks from UBS to Fidelity by filling out a few forms and everything moves (somewhat) seamlessly.

My data should be the same way. I should be able to move all my data out of Google and move it to Microsoft with a few clicks without losing any documents or even my folder hierarchy. [Disclaimer: Maybe this is possible already and I'm just out of the loop. If so, though, extend to all SaaS vendors and all data.]

al_borland · 52m ago
This mainly just requires the ability to export, and standard formats. For generic file storage, emails, contacts, calendars, etc, this is largely possible already. Though there are minor incompatibilities based on various implementations or customizations on top of the standard.

The big problem comes into play for new, or more custom types of applications. It takes a while for something to become ubiquitous enough that standard formats are developed to support them.

hodgesrm · 4h ago
> * Data portability guarantees: Vendors must spell out how data gets migrated out, and all formats must be either open or (at minimum) fully documented.

This is not practical for data of any size. Prod migrations to a new database take months or even years if you want things to go smoothly. In a crisis you can do it in weeks, but it can be really ugly. That applies even when moving between the same version of an open source database, because there's a lot of variation between the cloud services themselves.

The best solution is to have the data in your own environment to begin with and just unplug. It's possible with bring-your-own-cloud management combined with open source.

My company operates a BYOC data product which means I have an economic interest in this approach. On the other hand I've seen it work, so I know it's possible.

GMoromisato · 3h ago
I'd love to know more about BYOC. Does that apply to the raw data (e.g., the database lives inside the enterprise) or the entire application stack (e.g., the enterprise is effectively self-hosting the cloud).

It seems like you'd need the latter to truly be immune to cloud-vendor problems. [But I may not understand how it works.]

WarOnPrivacy · 3h ago
> End-of-life contracts: cloud-vendors should contractually spell out what happens if they can't afford to keep the servers running.

I'm trying to imagine how this would be enforced when a company shutters and its principals walk away.

GMoromisato · 3h ago
It's a good question--I am not a lawyer.

But that's the point of contracts, right? When a company shuts down, the contracts become part of the liabilities. E.g., if the contract says "you must pay each customer $1000 if we shut down" then the customers become creditors in a bankruptcy proceeding. It doesn't guarantee that they get all (or any) money, but their interests are negotiated by the bankruptcy judge.

Similarly, I can imagine a contract that says, "if the company shuts down, all our software becomes open source." Again, this would be managed by a bankruptcy judge who would mandate a release instead of allowing the creditors to gain the IP.

Another possibility is for the company to create a legal trust that is funded to keep the servers running (at a minimal level) for some specified amount of time.

WarOnPrivacy · 3h ago
> When a company shuts down, the contracts become part of the liabilities.

The asset in the contract is their customer's data; it is becoming stale by the minute. It could be residing in debtor-owned hardware and/or in data centers that are no longer getting their bills paid.

It takes time to get a trustee assigned and I think we need an immediate response - like same day. (NAL but prep'd 7s & 13s)

WarOnPrivacy · 3h ago
(cont. thinking...) One possibility. A 3rd party manages a continually updating data escrow. It'd add some expense and complexity to the going concern.
prmoustache · 3h ago
> Personally, I disagree with this approach. This is trying to solve a business problem (I can't trust cloud-providers)

It is not only a business problem. I stay away from cloud based services not only because of subscription model, but also because I want my data to be safe.

When you send data to a cloud service, and that data is not encrypted locally before being sent to the cloud (a rare feature), it is not a question of if but when that data will be pwned.

samwillis · 3h ago
> This is trying to solve a business problem (I can't trust cloud-providers) with a technical trade-off (avoid centralized architecture).

I don't think that's quite correct. I think the authors fully acknowledge that the business case for local-first is not completely solved and is a closely related problem. These issues need both a business and a technical solution, and the paper proposes a set of characteristics of what a solution could look like.

It's also incorrect to suggest that local-first is an argument for decentralisation - Martin Kleppmann has explicitly stated that he doesn't think decentralised tech solves these issues in a way that could become mass market. He is a proponent of centralised standardised sync engines that enable the ideals of local-first. See his talk from Local-first conf last year: https://youtu.be/NMq0vncHJvU?si=ilsQqIAncq0sBW95

GMoromisato · 2h ago
I'm sure I'm missing a lot, but the paper is proposing CRDTs (Conflict-free Replicated Data Types) as the way to get all seven checkmarks. That is fundamentally a distributed solution, not a centralized one (since you don't need CRDTs if you have a central server).
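For readers unfamiliar with CRDTs, a grow-only counter is about the smallest example: each replica increments only its own slot, and merge takes the per-replica maximum, so merges commute and all replicas converge without a central server. A generic sketch (not the paper's implementation):

```python
class GCounter:
    """Grow-only counter CRDT."""

    def __init__(self, replica_id: str):
        self.replica_id = replica_id
        self.counts = {}  # replica_id -> count observed from that replica

    def increment(self, n: int = 1):
        # Each replica only ever writes its own slot.
        self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + n

    def value(self) -> int:
        return sum(self.counts.values())

    def merge(self, other: "GCounter"):
        # Element-wise max: idempotent, commutative, associative,
        # so replicas can sync in any order and still agree.
        for rid, c in other.counts.items():
            self.counts[rid] = max(self.counts.get(rid, 0), c)
```

Real document CRDTs (lists, maps, rich text) are far more involved, but the convergence property rests on the same merge discipline.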

And while they spend a lot of time on CRDTs as a technical solution, I didn't see any suggestions for business model solutions.

In fact, if we had a business model solution--particularly one where your data is not tied to a specific cloud-vendor--then decentralization would not be needed.

I get that they are trying to solve multiple problems with CRDTs (such as latency and offline support) but in my experience (we did this with Groove in the early 2000s) the trade-offs are too big for average users.

Tech has improved since then, of course, so maybe it will work this time.

maccard · 3h ago
> Vendors must spell out how data gets migrated out, and all formats must be either open or (at minimum) fully documented.

Anecdotally, I’ve never worked anywhere where the data formats are documented in any way other than a schema in code.

mumbisChungo · 3h ago
A good contract can help you seek some restitution if wrongdoing occurs, you become aware of it, and you can prove it. It won't mechanically prevent the wrongdoing from happening.
Habgdnv · 4h ago
Currently there are laws but not for hosting. Look at the contract of Steam for example or Ubisoft, or anything else - Q: What happens to your game collection if we shut down our servers? A: You own nothing and lose everything, GG!

It's like how we decided to protect users' privacy from greedy websites by making the bad ones spell out that they use cookies to spy on users - and the result is what we have now with the banners.

GMoromisato · 4h ago
I agree with you! And your point about cookie banners underlines that we can't just rely on regulation (because companies are so good at subverting or outright lobbying their way out of it).

Just as with the open source movement, there needs to be a business model (and don't forget that OSS is a business model, not a technology) that competes with the old way of doing things.

Getting that new business model to work is the hard part, but we did it once with open source and I think we can do it again with cloud infrastructure. But I don't think local-first is the answer--that's just a dead end because normal users will never go with it.

ashdev · 3h ago
This was refreshing to read! More apps should be local-first. If the user does not want to sync their data to cloud, they should have that option.

I’ve been building the offline-first (or local-first) app Brisqi[0] for a while now, it was designed from the ground up with the offline-first philosophy.

In my view, a local-first app is designed to function completely offline for an indefinite period. The local experience is the foundation, not a fallback, and cloud syncing should be a secondary enhancement, not a requirement.

I also don’t consider apps that rely on temporary cache to be offline-first. A true offline-first app should use a local database to persist data. Many apps labeled as “offline-first” are actually just offline-tolerant, they offer limited offline functionality but ultimately depend on reconnecting to the internet.

Building an offline-first app is certainly more challenging than creating an online-only web app. The syncing mechanism must be reliable enough to handle transitions between offline and online states, ensuring that data syncs to the cloud consistently and without loss. I’ve written more about how I approached this in my blog post[1].

[0] https://brisqi.com

[1] https://blog.brisqi.com/posts/how-i-designed-an-offline-firs...
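Roughly, that reliability requirement boils down to an "outbox" pattern: local writes always succeed immediately, and a queue of pending operations is drained whenever connectivity returns. A minimal sketch of the idea (illustrative only, not Brisqi's actual code; all names here are made up):

```javascript
// Sketch of an "outbox" pattern for offline-first sync. Writes go to
// the local store first; pending operations queue up and are pushed
// to the cloud later, so the local experience never blocks on the
// network and no data is lost across offline/online transitions.
class OfflineFirstStore {
  constructor(pushToCloud) {
    this.data = new Map();          // stand-in for the local database
    this.outbox = [];               // operations not yet synced
    this.pushToCloud = pushToCloud; // async fn; may throw while offline
  }

  set(key, value) {
    this.data.set(key, value);                   // local write succeeds immediately
    this.outbox.push({ op: 'set', key, value }); // recorded for later sync
  }

  get(key) {
    return this.data.get(key);
  }

  // Call when the app detects it is back online.
  async flush() {
    while (this.outbox.length > 0) {
      const op = this.outbox[0];
      await this.pushToCloud(op); // if this throws, the op stays queued
      this.outbox.shift();        // drop only after confirmed delivery
    }
  }
}
```

The key detail is that an operation leaves the queue only after the upload is confirmed, so a failed push simply retries on the next flush.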

davepeck · 5h ago
In theory, I love the local-first mode of building. It aligns well with “small tech” philosophy where privacy and data ownership are fundamental.

In practice, it’s hard! You’re effectively responsible for building a sync engine, handling conflict resolution, managing schema migration, etc.

This said, tools for local-first software development seem to have improved in the past couple years. I keep my eye on jazz.tools, electric-sql, and Rocicorp’s Zero. Are there others?

rzzzt · 5h ago
CouchDB on the server and PouchDB on the client was an attempt at making such an environment:

- https://couchdb.apache.org/

- https://pouchdb.com/

Also some more pondering on local-first application development from a "few" (~10) years back can be found here: https://unhosted.org/

zdragnar · 5h ago
I think I saw someone point out automerge not long ago:

https://automerge.org/

Rust and JavaScript implementations, a handful of network strategies. It doesn't come with the free or paid offering that jazz.tools does, but it's pretty nice.
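To illustrate the guarantee that Automerge generalizes (this is a toy grow-only set, one of the simplest CRDTs, not Automerge's actual API): replicas that have seen the same updates converge to the same state, regardless of the order in which the updates arrived.

```javascript
// Toy G-Set (grow-only set) CRDT. Merge is a set union: commutative,
// associative, and idempotent, so any sync order yields the same state.
// Libraries like Automerge provide far richer types (maps, lists,
// text) with the same convergence property.
class GSet {
  constructor() { this.items = new Set(); }
  add(item) { this.items.add(item); }
  merge(other) { for (const x of other.items) this.items.add(x); }
  values() { return [...this.items].sort(); }
}

// Two replicas edit independently while offline...
const a = new GSet();
const b = new GSet();
a.add('milk');
b.add('eggs');
b.add('milk');

// ...then sync in either direction and converge.
a.merge(b);
b.merge(a);
```

The catch, of course, is that real apps need deletion and ordering, which is where the sophisticated CRDT designs in these libraries come in.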

ofrzeta · 5h ago
Do you know that website? https://www.localfirst.fm

EDIT: actually I wanted to point to the "landscape" link (in the top menu) but that URL is quite unergonomic.

davepeck · 4h ago
No, I didn't know about it -- thank you! (EDIT: and the landscape page has lots of libraries I hadn't run across before. Neat.)
jessmartin · 11m ago
One of the authors of the Landscape here. Glad you found it helpful!
samwillis · 3h ago
Along with the others mentioned, it's worth highlighting Yjs. It's an incredible CRDT toolkit that enables many of the realtime and async collaborative editing experiences you want from local-first software.

https://yjs.dev/

thorum · 2h ago
I’ve built several apps on yjs and highly recommend it. My only complaint is that storing user data as a CRDT isn’t great for being able to inspect or query the user data server-side (or outside the application). You have to load all the user’s data into memory via the yjs library before you can work with any part of it. There are major benefits to CRDTs but I don’t think this trade-off is worth it for all projects.
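A common workaround is to materialize each document into a plain, queryable projection whenever it syncs, so ordinary queries never have to touch the CRDT encoding. A hypothetical sketch (the `decodeDoc` parameter stands in for loading the doc with the CRDT library, which is exactly the expensive in-memory step described above):

```javascript
// Hypothetical sketch: on each sync, decode the CRDT document once and
// write its current state into a plain "projection" table that can be
// queried server-side without the CRDT library. The decoder is injected
// because in a real system it would be the CRDT library's load step.
function materialize(decodeDoc, update, projectionTable) {
  const doc = decodeDoc(update);     // the unavoidable full-doc load
  for (const [key, value] of Object.entries(doc)) {
    projectionTable.set(key, value); // now queryable as ordinary data
  }
}

// Usage with a fake JSON "decoder" and an in-memory table stand-in:
const table = new Map();
materialize((u) => JSON.parse(u), '{"title":"notes","words":42}', table);
```

You pay the load cost once per sync instead of once per query, but the trade-off of maintaining a second representation is real.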
3036e4 · 4h ago
I use local software and sync files using git or sometimes fossil (both work fine on Android with termux, for instance, for stuff I want to access on my phone). I don't host servers or use any special software that requires syncing data in special ways.
sgt · 3h ago
There's also PowerSync: https://www.powersync.com/

It's also open source and has bindings for Dart, JS, Swift, C#, Kotlin, etc.

ochiba · 3h ago
This site also has a directory of devtools: https://lofi.so/
ibizaman · 5h ago
That’s essentially what I’m trying to make widely available through my projects https://github.com/ibizaman/selfhostblocks and https://github.com/ibizaman/skarabox. Their shared goal is to make self-hosting more approachable to the masses.

It’s based on NixOS to provide as much as possible out of the box and declaratively: https, SSO, LDAP, backups, ZFS w/ snapshots, etc.

It’s a competitor to cloud hosting because it packages Vaultwarden and Nextcloud to store most of your data. It does provide more services than that though, home assistant for example.

It’s a competitor to YUNoHost but IMO better (or aims to be) because you can use the building blocks provided by SelfHostBlocks to self-host any packages you want. It’s more of a library than a framework.

It’s a competitor to NAS but better because everything is open source.

It still requires the user to be technical, but I'm working on removing that caveat. One of my goals is to let you install it on your own hardware without needing Nix or touching the command line.

pastaheld · 2h ago
Love it! I've been thinking about this a lot lately. It's crazy how many great FOSS alternatives are out there to everything – and while they might be relatively easy to install for tech-people ("docker compose up"), they are still out of reach for non-tech people.

Also, so many of these selfhostable apps are web applications with a db, server and frontend, but for a lot of use cases (at least for me personally) you just use it on one machine and don't even need a "hosted" version or any kind of sync to another device. A completely local desktop program would suffice. For example I do personal accounting once a month on my computer – no need to have a web app running 24/7 somewhere else. I want to turn on the program, do my work, and then turn it off. While I can achieve that easily as a developer, most of the people can't. There seems to be a huge misalignment (for lack of a better word) between the amount of high-quality selfhostable FOSS alternatives and the amount of people that can actually use them. I think we need more projects like yours, where the goal is to close that gap.

I will definitely try to use selfhostblocks for a few things and try to contribute, keep it up!

virgoerns · 2h ago
I love that you include hledger! It's amazing piece of software, even if a little obscure for people unfamiliar with plaintext accounting!
voat · 4h ago
Looks really neat! Thanks for building this
2color · 2h ago
It's a very exciting moment for this movement. A lot of the research and tech for local-first is nearing the point that it's mature, efficient, and packaged into well designed APIs.

Moreover, local-first —at least in theory— enables less infrastructure, which could reignite new indie open source software with less vendor lock-in.

However, despite all my excitement about embracing these ideas in the pursuit of better software, there's one hurdle preventing more widespread adoption amongst developers, and that is the Web platform.

The Web platform lacks building blocks for distributing hashed and/or signed software that isn't tied to origins. In other words, it's hard to decouple web-apps from the same-origin model which requires you set up a domain and serve requests dynamically.

Service Workers and PWAs do help a bit in terms of building offline experiences, but if you want users to download once, and upgrade when they want (and internet is available), you can't use the Web. So you end up breaking out of the browser, and start using Web technologies outside of the browser with better OS functionality, like Electron, React Native, Tauri et al (the https://userandagents.com/ community is doing some cool experiments in this space).

kristianc · 1h ago
The old model—a one-time purchase, local install, full user control—worked because devs could sell boxed software at scale. Now, that model collapses unless someone's willing to either undervalue their own labour, or treat the software like a public good, absorbing the long tail of maintenance with no recurring income.

The article posits it as though subscription software is something which has been sneaked in on us. But users today expect things like instant updates, sync across devices, collaboration, and constant bug fixes and patches - none of which come easily if you're only willing to pay for the system once.

OjotCewIo · 55m ago
> as though subscription software is something which has been sneaked in on us

Oh but it has (IMO).

> users today expect things like instant updates [...] constant bug fixes and patches

Nah, this is in reverse. With boxed software, the developer had to deliver an essentially bug-free product. Now, with easy updates technically possible, the developers have gone complacent, and deliver shit. That is why users expect bugfixes instantly. (And any enlightened user abhors unrequested features, as there are no features without regressions, and who wants regressions in any serious application?) The only tolerable online updates are security fixes.

> sync across devices, collaboration

This is a valid expectation, but its execution has been a train-wreck. Research, design and implementation should start with end-to-end encryption; the network architecture should be peer-to-peer (mesh, not centralized). What do we get instead? More centralization of control than ever, and less privacy and ownership than ever.

kristianc · 49m ago
Generally that's not how I remember it - third party software on the Mac at least got some kind of a beach-head because Windows software was full of bugs, crashes, corrupted files, drivers that never worked, and patch CDs mailed to enterprise customers like they were firmware apologies. Own your own software, taken to its logical endpoint, was a shareware nightmare.
flkenosad · 42m ago
> treat the software like a public good, absorbing the long tail of maintenance with no recurring income.

Good point. Governments would do this if they really worked "for the people"

dtkav · 2h ago
We need a term for a viable business model to pair with local-first tech.

I've been working on Relay [0] (realtime multiplayer for Obsidian) and we're trying to follow tailscale's approach by separating out the compute/document sync from our auth control plane.

This means that users still subscribe to our service (and help fund development) and do authn/authz through our service, but we can keep their data entirely private (we can't access it).

[0] https://relay.md

jessmartin · 9m ago
Relay user here! It’s great. Quite reliable for an early product.
dtkav · 1m ago
Thanks for the kind words
hemant6488 · 2h ago
I've been building exactly this with SoundLeaf [0] - an iOS client for the excellent open-source Audiobookshelf server. No data collection, no third-party servers, just your audiobooks syncing directly with your own instance.

The user-friendliness challenge is real though. Setting up Audiobookshelf [1] is more work than "just sign up," but once you have it running, the local-first client becomes much cleaner to build. No user accounts, no subscription billing, no scaling concerns. Simple pricing too: buy once, own forever. No monthly fees to access your own audiobooks.

[0] https://soundleafapp.com

[1] https://github.com/advplyr/audiobookshelf

sunshine-o · 1h ago
Most of that stuff was very much over-engineered in the last two decades.

The backend for my personal notes, tasks, bookmarks, calendar and feeds is a set of files in directories synced with Syncthing across devices.

I ended there after going from one app to another and being tired of all this.

It is self-hosted with no server backend (beyond an optional Syncthing node on a NAS or VPS). It is very reliable and works without an Internet connection.

I could have put everything in sqlite too and sync it one way or another, but it seemed already too complicated for my requirements.

I can't share it beyond my close relatives but I had the same problem with people using Google or Microsoft before.

bhauer · 5h ago
I've been wanting a computing model I call PAO [1] for a long time. PAO would run personal application "servers" and connect dynamic clients across all devices. PAO is centralized, but centralized per user, and operating at their discretion. It avoids synchronization, complex concurrent data structures, and many other problems associated with alternatives. Its weakness is a need for always-on networks, but that complication seems ever easier to accept as omnipresent networks become realistic.

[1] https://tiamat.tsotech.com/pao (2012)

chrisweekly · 1h ago
> "we have gone further than other projects down the path towards production-ready local-first applications based on CRDTs"

This seems like a bold claim, but IMHO Ink & Switch have earned their solid reputation and it wouldn't surprise me if it's true. I agree w/ their analysis and am philosophically aligned w/ their user-centric worldview. So who's going to build "Firebase for CRDTs"?

packetlost · 1h ago
> Firebase for CRDTs

Do you actually need anything special for CRDTs over a normal database? My understanding is the actual CRDT part is done "client side"

chrisweekly · 1h ago
I was just referring to the posted article's assertion that "Firebase for CRDTs" is a huge opportunity. I think I agree w the authors that a well-architected CRDT solution for local-first apps requires capabilities not currently provided by Firebase or any other vendor. But I'm no expert.
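Since merging happens in the client library, a minimal "Firebase for CRDTs" could in principle treat updates as opaque blobs: persist them per document, and replay them to clients that reconnect. A toy sketch of that shape (illustrative only; a real service would also need auth, compaction, and presence, which is presumably where the unmet capabilities come in):

```javascript
// Toy generic CRDT backend: the server never inspects update payloads.
// It only appends them to a per-document log and serves the suffix a
// reconnecting client has not yet seen. All merge logic lives in the
// client's CRDT library.
class UpdateRelay {
  constructor() { this.log = new Map(); } // docId -> array of opaque updates

  // A client pushes an update; the server just appends it.
  push(docId, update) {
    if (!this.log.has(docId)) this.log.set(docId, []);
    this.log.get(docId).push(update);
  }

  // A (re)connecting client pulls everything after its last-seen index.
  pullSince(docId, index) {
    return (this.log.get(docId) || []).slice(index);
  }
}
```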
cheema33 · 4h ago
The primary challenge with building local first software is the sync layer. The current 3rd party offerings are not mature. And people have been working on these for a few years. Electric SQL comes to mind.
coffeecoders · 2h ago
Lately, I have been following this approach and moving towards local-first software. I like simple software with barebones features.

- Password manager: KeePassXC

- Notes: Logseq

- Analytics: Plausible

- Media: Jellyfin

- Uptime monitoring: Uptime Kuma

- Finance tracker: Actual Budget and the like are too heavy, so I built this: https://github.com/neberej/freemycash/

- Search: Whoogle? is kinda dead. Need alternative.

Incipient · 5h ago
The data part aside, and specifically on the platform/functionality side: these cloud/large products unfortunately do offer more powerful/advanced features, or convenience. Be it cloud multi-device functionality that makes moving around and collaborating seamless, or enterprise products like Snowflake and Fabric that offer all sorts of extras over a standard MSSQL db.

I'm personally very against vendor lock in, but there is some value to them.

outlore · 4h ago
What are the top web local first frameworks worth checking out these days? i’ve heard of livestore, tanstack DB with electric, zero. any others that are easy to use and flexible? use case is multiplayer apps and maybe games. thanks!
Existenceblinks · 2h ago
Tried to adopt this last month at work; it failed. E.g. the mentioned Automerge has poor docs https://automerge.org/docs/reference/library_initialization/... that leave a lot of questions open. It seems backend-agnostic, but we had to figure out ourselves how to store and how to broadcast updates.
jes5199 · 31m ago
yeah I tried to build a project on Automerge but I ended up switching to Yjs, it seems more mature.
danjl · 2h ago
Goal #2, "your data is not trapped in a single device," is the hard bit, especially combined with goal #3, "the network is optional." For #2 to be true, the network is *not* optional for the developer; it is required. Thus building a distributed app, especially one without a centralized server (which is particularly difficult even with modern local-first database tools), is far more complex than writing either a traditional desktop app or a cloud app.
thenthenthen · 2h ago
Didn't this already happen? The internet died 20 years ago. Now it is just ‘somewhat’ interconnected intranets with their own local legislation?
miladyincontrol · 2h ago
In a world of owning nothing and paying subscriptions for everything, owning your data and using software that is either yours or libre is 'rebellion' to many a service provider.

It's not local-first or some sort of cloud diet trend; it should be the norm.

OjotCewIo · 1h ago
Right. I don't even understand why this article had to be this verbose. It's not like we need to be "convinced" that local is better. Everybody who values privacy and independence knows already. But this stuff is unimplementable -- we suffer from the cloud disease because it's immensely profitable for the cloud providers and cloud-based app providers to enslave us, and to bleed us out. Their whole point is locking us in.

"Sharing models" are totally irrelevant until the concept of self-determination is tolerated by the powerful (and they will never tolerate it). Accessing my data from multiple devices is totally secondary; I don't trust mobile endpoints to access my remote data in the first place.

sygned · 3h ago
I've made a local first, end-to-end encrypted, auto sync bookmark extension that doesn't milk your data in any way. It's 100% private, I even don't use Google analytics on my website. Some of the reasons why I've put some work into this is:

  - because I could not find something similar that doesn't milk and own my data
  - to never lose a bookmark again
  - to have my bookmark data encrypted in the cloud
  - to have private history
  - to have some extra time saving features in the extension that are for unknown reason rare to find
  - more learning and experience (it's actually quite complex to build this)
After about 4 years of using it daily on every pc I own, I found out it's a pain for me and my family when it is not installed on a browser. I thought: if it's useful for us, it might be useful for others too! So, I decided to make it available by subscription for a small fee to cover the server and other costs. I'm not really into marketing, so almost no one knows it exists. You can find it on markbook.io.
alganet · 1h ago
Offline-first, now with CRDTs, and a brand new name!
neon_me · 4h ago
100%! Not only local-first. But also private, zero/minimal dependency, open source and environment agnostic!

If there is anyone interested in working on such projects - let's talk! We can't leave our future to greedy surveillance zealots.

jumploops · 5h ago
One thing I’m personally excited about is the democratization of software via LLMs.

Unfortunately, if you go to ChatGPT and ask it to build a website/app, it immediately points the unknowing user towards a bunch of cloud-based tools like Fly.io, Firebase, Supabase, etc.

Getting a user to install a local DB and a service to run their app (god forbid, updating said service), is a challenge that’s complex, even for developers (hence the prevalence of containers).

It will take some time (i.e. pre-training runs), but this is a future I believe is worth fighting for.

moffkalast · 5h ago
Local LLMs are even more amazing in concept, all of the world's knowledge and someone to guide you through learning it without needing anything but electricity (and a hilariously expensive inference rig) to run it.

I would be surprised if in a decade we won't have local models that are an order of magnitude better than current cloud offerings while being smaller and faster, and affordable ASICs to run them. That'll be the first real challenger to the internet's current position as "the" place for everything. The more the web gets enshittified and commercialized and ad-ridden, the more people will flock to this sort of option.

hkt · 4h ago
Self hosting (which is often adjacent to local-first software) is fine. I've done it for years.

But it is a nightmare when it goes wrong: the conclusion I've reached is that it is out of reach to regular people who don't want the Byzantine support load that could accompany something going wrong. They want turnkey. They want simple. They aren't interested in operating services, they're interested in using them.

The FLOSS model of self hosting doesn't really offer a reliable way of getting this: most businesses operating this way are undercapitalised and have little hope of ever being any other way. Many are just hobbies. There are a few exceptions, but they're rare and fundamentally the possibility of needing support still exists.

What is needed, imo, is to leverage the power of centralised, professional operations and development, but to govern it democratically. This means cooperatives where users are active participants in governance alongside employees.

I've done a little work towards this myself, in the form of a not-yet-seen-the-light-of-day project.

What I'd love to see is a set of developers and operators actually getting paid for their work and users getting a better deal in terms of cost, service, and privacy, on their own (aggregate) terms. Honestly, I'd love to be one of them.

Does anyone think this has legs to the same extent as local-first or self hosting? Curious to know people's responses.

ibizaman · 1h ago
This is the business model I want to have: I work on a stack of fully open source software and package them in a turn-key server that you own. You can use it on your own for free if you’re knowledgeable and I offer a subscription where I’m the sysadmin of the box you own and that I built for you. I do the maintenance, the updates, etc. There’s no lock-in because you can stop the subscription anytime or even just pick another sysadmin that would know the stack. The only reason you’d keep me around would be that the service I offer is pretty damn good. Would something like that appeal to you?
mxuribe · 4h ago
I was about to suggest that a better, more open, and fair form of capitalism would need to be used as a tool...but then, re-reading your comment - "...leverage the power of centralised, professional operations and development, but to govern it democratically..." - i think you better encapsulate what i meant to convey. :-)

That being said, yes, i do believe *in the near/upcoming future* local-first, self-hosting and i will add more fair open source vendors will work! Well, at least, i hope so! I say that because Europe's recent desire to pivot away from the big U.S. tech companies, and towards more digital sovereignty - in my opinion - lays the foundation for an ecosystem that could sustain self-hosting, etc. The more that Europe is able to pivot away from big tech, the more possibility exists for more and varied non-big-tech vendors to manifest... and the more that Europe adopts open source, the more the possibility that usage and expertise of self-hosting grows... plus, for those who do not know how to, or simply do not wish to, manage services themselves... well, in time i think Europe will have fostered a vast array of vendors who can provide such open source digital services, but get paid a fair cost for providing fair value/services, etc. ... and, by the way, i say this all as a biased person in favor of open source AS WELL AS being an American. :-)

OjotCewIo · 1h ago
> What is needed, imo, is to leverage the power of centralised, professional operations and development, but to govern it democratically. This means cooperatives where users are active participants in governance alongside employees.

Utopia. Unattainable. Self-determination of the individual has been consistently persecuted under all societal arrangements; communism and capitalism equally hate a citizen that wants to remain independent and self-sufficient.

didgetmaster · 5h ago
Databases like Postgres can be run locally or as part of some kind of managed service in the cloud. Anyone know of recent stats that show the percentage of databases that are managed locally vs by some cloud service?
curtisblaine · 1h ago
With crdt implementations like y.js, writing your own synchronization engine is trivial: https://greenvitriol.com/posts/sync-engine-for-everyone
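The core loop in that post is small: apply remote updates, broadcast local ones, and replay a backlog to late joiners. A pure-JS sketch of that shape, using a last-writer-wins map in place of a real CRDT (with Yjs, `Y.applyUpdate` and `doc.on('update', ...)` play these roles, and the channel would be a WebSocket):

```javascript
// Last-writer-wins map standing in for the CRDT: updates with a
// strictly newer timestamp win, so replaying updates in any order
// converges. (Real engines break timestamp ties deterministically.)
class LwwMap {
  constructor() { this.entries = new Map(); } // key -> {key, value, ts}
  apply(update) {
    const cur = this.entries.get(update.key);
    if (!cur || update.ts > cur.ts) this.entries.set(update.key, update);
  }
  get(key) { return this.entries.get(key)?.value; }
}

// In-memory stand-in for the network transport.
const channel = { backlog: [], subscribers: [] };

// The whole "sync engine": replay the backlog, subscribe to live
// updates, and return a function that publishes local edits.
function connect(replica) {
  for (const u of channel.backlog) replica.apply(u); // catch up
  channel.subscribers.push((u) => replica.apply(u)); // live updates
  return (key, value, ts) => {
    const u = { key, value, ts };
    channel.backlog.push(u);
    for (const fn of channel.subscribers) fn(u);
  };
}
```

"Trivial" is doing some work in that claim, though: persistence, reconnection, and authorization are where the sketch stops and the engineering starts.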
lutusp · 4h ago
Complete agreement. Here's a brief, practical action plan for Windows users:

  * Download all your data from Microsoft's "OneDrive" cloud storage, which, if not disabled, is the default storage method in a new Windows install.
  * Verify that all your files are now stored locally.
  * Click the gear icon, go to "Settings" -> "Account" -> "Unlink this PC," right-click, "Unlink account".
  * Remove Microsoft's OneDrive app from your system -- full removal is the only way to prevent perpetual harassment and reactivation. Go to "Apps" -> "Apps & features" (or "Installed apps" on Windows 11) -> "Microsoft OneDrive", right-click, "Uninstall."
  * Optional extra step: cancel your Microsoft 365 subscription and install LibreOffice (free, open-source).
Remember this -- cloud storage only has advantages for Microsoft and law enforcement (which have a number of easy ways to gain access to your documents compared to local storage). For a Windows user, cloud storage is the ultimate Dark Pattern.
3cats-in-a-coat · 3h ago
How about redundancy in general. Not local first, not cloud first, but "anything can be first and last". That's how the "cloud" works in the first place. Redundancy. Mesh networks as well.
cyanydeez · 5h ago
Local-first almost equates to being both privacy-protective and a public software good.

Essentially antithetical to capitalism, especially America's toxic late stage subscription based enshittification.

Which means its typically a labor of love or a government org has a long term understanding of Software as a Infrastructure (as opposed to SaaS)

bigyabai · 3h ago
"Local first" is neither equivalent to privacy protection nor to a public software good. Many businesses sell local-first software that still contains remote backdoors[0] you cannot control. And it most certainly doesn't ensure "public software good" when there is zero obligation to improve the upstream or empower users to seek alternatives.

I would sooner trust a GPL-licensed remote software program than store a kilobyte of personally identifying information in a proprietary "local first" system.

[0] https://www.macrumors.com/2023/12/06/apple-governments-surve...

Nevermark · 5h ago
I think you mean antithetical to corrupted conflict-of-interest capitalism.

Conflict-of-interest transactions have hidden or coercive impact, lined up in favor of the party with stronger leverage. Examples include un-asked and unwanted surveillance of data or activity, coercive use of information, vendor lock in, unwanted artificial product/service dependencies, insertion of unwanted interaction (ads), ...

None of that is inherent to capitalism. They clearly violate the spirit of capitalism, free trade, etc.

It is providers taking advantage of customers' lack of leverage and knowledge to extract value that does not reflect the plain transaction actually desired by customers. Done legally but often with surreptitious disclosure or dark-pattern permissions, borderline legally where customers would incur great costs to identify and protest, or plain old illegally but in a hidden manner, with a massive legal budget to provide a moat against accountability.

It is tragic that the current generation of Silicon Valley and VC firms have embraced conflict-of-interest business models, due to the amounts of money that scaling "small" conflicts can make, and despite the great damage that we now know scaling up "small" conflicts can do.

That was not always the case.

nicoburns · 5h ago
The problem with our current system of capitalism is that it causes capital to accumulate. This leads to less competition, fewer checks and balances, and undermines the whole "wisdom of the crowd" mechanism that capitalism is premised on.

If we want a functioning market based system then we need to explicitly correct for this by aggressively taxing the wealthiest entities (individuals and companies) in our society to bring things closer to a level playing field.

ndr · 5h ago
It might be antithetical to rent seeking at best, but capitalism?