I was an early adopter and developed a couple of service-oriented capsules, but as time went by my interest faded completely. I'm a strong advocate of live and let live, so this is not a critique or a discouragement post, but rather my own perspective.
Like many have mentioned already, I personally would have preferred pure markdown and no gemtext at all. Similarly, and although I understand the reasoning behind making encryption mandatory, I believe it should be optional in the spirit of KISS. I'm more of a minimalist than I am a privacy evangelist. In this regard, I felt a bit out of place within the gemini community.
Finally, the argument that it takes a new protocol to avoid a broken user experience, often exemplified by someone jumping from a simple and well-behaved HTTP website into a chaotic one, doesn't resonate much with me. Again, I get it, but I can live with visiting only the websites or gopherholes I want. This comes with a great advantage. Even if we consider just the minimalist and well-designed websites, this means hordes of content when compared to all gemini capsules. I missed a broader set of topics when I used gemini and ultimately that was what killed my interest.
All that said, I loved it while I used it and I stumbled upon some really nice people. Maybe I'll fall in love again one day...
gluon
ndiddy · 3h ago
> Even if we consider just the minimalist and well-designed websites, this means hordes of content when compared to all gemini capsules. I missed a broader set of topics when I used gemini and ultimately that was what killed my interest.
This is definitely Gemini's biggest weakness. I looked around on it a bit when it was gaining attention, and most of the sites I saw were just complaints about how bloated the modern web had become. I get it, but it's kind of treating the whole thing as a novelty rather than an actual medium that can be used to convey information. It didn't have the wide and varied userbase that even the mid-90s academic web they were trying to replicate had. It kind of reminded me of all the people who write a static site generator for their blog, and then only write a single blogpost about how they made their static site generator.
rollcat · 3h ago
> Again, I get it, but I can live with visiting only the websites or gopherholes I want. [...] Even if we consider just the minimalist and well-designed websites, this means hordes of content when compared to all gemini capsules.
I agree. Personally, I'm a fan of progressive enhancement.
E.g. I use a Hugo partial to hide emails: it de-obfuscates an address using JavaScript, and falls back to printing a shell command.
Similar for CSS, although that one is a forever WIP...
gcarvalho · 1h ago
I can attest that CSS is very effective for obfuscating e-mail.
I displayed my academic e-mail on my webpage for over half a decade using CSS to flip the text direction[1] without getting significant spam.
[1] https://superuser.com/a/235965
This kind of approach is exactly why I believe we can have a nice experience over HTTP. Progressive content enhancement nails the perfect balance between too simple and too bloated. I personally believe client-side scripting is important and ideally should be used sparingly. Your example illustrates a perfectly reasonable use case where JavaScript makes sense while still providing a scriptless alternative that solves the problem. Nice stuff.
RiverCrochet · 3h ago
The thing I liked about Gemini and its self-imposed limitations is that it was very much impossible to create a misbehaving Gemini document. There is no way a Gemini browser will phone home, run malicious code on my side, grab/upload my browser history or send sensor or other data because I forgot to turn off various options, etc. To me the entire thing was much more trustworthy.
You can of course recreate this experience using HTTP and modern browsers, but both are so complicated that you don't know what's really happening without a lot of work.
jug · 1h ago
This should really be a more common feature in web browsers. Yes, it can be achieved by turning off JavaScript and so on, but it should be a first-class feature like Incognito mode: either a high-visibility toggle button, or a way to open tabs in this mode where child tabs inherit the mode from their parent. That way, you'd have something like Gemini for the regular web just by making websites that don't break when that kind of code is disabled.
vascocosta · 2h ago
I liked that as well, but I wouldn't have remembered it without reading your comment. I guess all in all it is a pretty nice protocol; the only real problem for me is that the content is too niche to appeal to me on a daily basis.
floren · 1h ago
The thing about Gemini is that it reads like someone who thinks Gopher sounds neat but has only ever dealt with HTTP and HTML/Markdown... so they took HTTP GET, chopped a digit off the response codes, and called it a new protocol, then tacked on an intentionally-broken Markdown implementation (more broken than the original Markdown, I mean).
Interesting note: the first line of a Gemini response is a MIME type. It's usually `text/gemini` but there's no reason it can't be `text/html`, `application/javascript`, or anything else. A while back I did a little poking in some Gemini server code and made it do precisely that: serve HTML files which I accessed via elinks. Of course once you're serving HTML over Gemini you might ask, exactly what advantage am I getting by putting it over a purposefully-broken subset of HTTP, and I would say that's a damn good question.
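For illustration, a rough sketch of that kind of server in Python (not the code I actually poked at; the cert paths, the port handling and the HTML body are placeholders):

```python
import socket
import ssl

HTML = b"<!doctype html><title>hi</title><p>HTML served over Gemini.</p>"

def serve(host: str = "0.0.0.0", port: int = 1965) -> None:
    # Gemini mandates TLS; cert.pem/key.pem stand in for a self-signed pair.
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    context.load_cert_chain("cert.pem", "key.pem")

    with socket.create_server((host, port)) as listener:
        while True:
            raw_conn, _addr = listener.accept()
            with context.wrap_socket(raw_conn, server_side=True) as conn:
                _request = conn.recv(1026)  # one line: absolute URL + CRLF (ignored here)
                # The response header is "<status> <meta>\r\n"; meta is a MIME type,
                # and nothing stops it from being text/html instead of text/gemini.
                conn.sendall(b"20 text/html\r\n" + HTML)

if __name__ == "__main__":
    serve()
```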
In 2024 I wrote 'The modern Web and all its crappiness didn't come about because there's something inherently wicked in HTML and HTTP, it came about because people built things on top of the basic foundation, extending (sometimes poorly) and expanding. The more people play with Gemini, the more they'll want to "extend" it... and the closer they'll bring it to HTTP, because it follows the exact same fundamental model once you strip off the extraneous document format specification' and I stand by it.
NoboruWataya · 9h ago
I'll add my name to the list of people who like the idea and were very curious about it when they first heard about it but now don't think about it as much.
It's very fun to develop for. The simplicity of the protocol means that writing a server, client or "web app" (equivalent) is a weekend project. So there is a proliferation of software for it but that doesn't necessarily translate into content.
There is content, though. My favourite aggregator is gemini://warmedal.se/~antenna/ and I do still drop by there regularly enough to have a browse. It's no longer all meta content which is good (people used to just use Geminispace to write about Gemini). It's still quite tech/FOSS focused, unsurprisingly.
I agree with the other comments that are saying that a simple markdown would have been better than gemtext.
Whenever Gemini gets mentioned on HN there are a lot of commenters who seem to have an issue with the "views" or "values" of some people within the community. They never go into detail. I can honestly say I'm not sure what the issue is. As a very middle-of-the-road centrist I have never had much of an issue with the content I find on Gemini. Sure, you had a few interesting "characters" on the mailing list (back when it existed) but they were a minority and it was nothing you don't also find on the web. I guess people there tend to be more dogmatic about sticking to FOSS and keeping the internet non-corporate, which can rub people the wrong way, but again you can find similar views on the web (and IMO it makes for interesting discussions even if I don't agree with the dogmatism).
rickcarlino · 3h ago
Regardless of the technical shortcomings of the protocol, a grassroots group of individuals has managed to create a viable new network protocol. The user base is small, but it is not tiny and it is not nonexistent and it has been going for years now. You can download a Gemini client and find regularly updated blogs on a Gemini search engine. You can have discussions on Gemini applications like Antenna or Station. It has managed to solve many of the problems it intended to solve (privacy, resource bloat, protocol specification bloat, etc.).
throwaway328 · 20m ago
What did it do for privacy?
I think Gemini is great, and read from Nyxt browser. Don't know if I've seen any references to privacy benefits, so curious.
rickcarlino · 14m ago
I’m pretty sure they talk about privacy directly in the spec, but I haven’t read the spec in years. Going off the top of my head, they had the decision to not include things like user agent headers or anything that resembles a cookie specifically with the goal of preserving privacy. There is also the obvious point of the protocol not supporting raw TCP sockets and requiring encryption by default.
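To make that concrete, here is a rough sketch of a fetch as I remember the protocol (the host is whatever you want to query, and a real client would do trust-on-first-use certificate pinning rather than skipping verification): the entire request is one URL line over TLS, so there is simply nowhere to put a user agent or a cookie.

```python
import socket
import ssl

def gemini_fetch(host: str, path: str = "/", port: int = 1965) -> tuple[str, bytes]:
    context = ssl.create_default_context()
    # Geminispace leans on self-signed certs, so this sketch skips CA checks;
    # proper clients remember ("pin") the certificate they saw first.
    context.check_hostname = False
    context.verify_mode = ssl.CERT_NONE

    with socket.create_connection((host, port)) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            # The whole request: one absolute URL plus CRLF. No User-Agent,
            # no cookies, no headers at all.
            tls.sendall(f"gemini://{host}{path}\r\n".encode("utf-8"))
            raw = b""
            while chunk := tls.recv(4096):
                raw += chunk

    # Response header: "<two-digit status> <meta>\r\n", e.g. "20 text/gemini".
    header, _, body = raw.partition(b"\r\n")
    return header.decode("utf-8"), body
```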
rcarmo · 10h ago
I wrote a server for it a while back (am still running it someplace behind a CF tunnel), but I've never really felt that either the community or the protocol was taking off:
https://github.com/rcarmo/aiogemini
A key issue with the ecosystem (not the protocol) as far as I’m concerned is that it would have been stupendously better to settle on Markdown (even a simplified form) for content creation. The rest is OK, I guess, but it’s just a nuisance to maintain “dual format” sites.
(I see a few comments here about the community’s opinions and leanings, but to be honest it’s not any weirder than your average old-timey IRC channel or fringe Mastodon server -- live and let live, read what you want, and just skip Antenna listings you don’t like)
uamgeoalsk · 9h ago
Being able to parse gemtext line by line with almost no context is a big win for simplicity - you can't really do that with markdown.
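As a rough sketch of what that buys you (line types as I read the gemtext spec, not anyone's production parser), the only state a parser has to carry is whether a preformatted block is open:

```python
def parse_gemtext(text: str):
    """Yield one tuple per gemtext line; the only state is the preformatted toggle."""
    preformatted = False
    for line in text.splitlines():
        if line.startswith("```"):
            preformatted = not preformatted          # toggle preformatted mode
        elif preformatted:
            yield ("pre", line)                      # verbatim line inside the block
        elif line.startswith("=>"):
            parts = line[2:].strip().split(maxsplit=1)   # "=> URL optional label"
            url = parts[0] if parts else ""
            yield ("link", url, parts[1] if len(parts) > 1 else url)
        elif line.startswith("#"):
            level = min(len(line) - len(line.lstrip("#")), 3)
            yield ("heading", level, line.lstrip("#").strip())
        elif line.startswith("* "):
            yield ("list", line[2:])
        elif line.startswith(">"):
            yield ("quote", line[1:].lstrip())
        else:
            yield ("text", line)
```

Rendering is then a direct map from those tuples to whatever the client draws, which is a big part of why clients and servers end up being weekend projects.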
Avshalom · 5h ago
It's not line-by-line, but djot was designed to be parsed more easily/efficiently than markdown while being basically as featureful and ergonomic.
blueflow · 8h ago
It is possible if you restrict yourself to a subset of markdown. It works pretty well, actually; I have two awk scripts that take in a subset of markdown and generate either HTML or LaTeX.
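Not the awk scripts themselves, but a Python sketch of the HTML direction under the assumption that the subset is just ATX headings, bullet items and plain paragraphs:

```python
import html

def markdown_subset_to_html(text: str) -> str:
    out, in_list = [], False
    for line in text.splitlines():
        stripped = line.strip()
        if in_list and not stripped.startswith("- "):
            out.append("</ul>")                      # close a bullet list when it ends
            in_list = False
        if stripped.startswith("#"):
            level = min(len(stripped) - len(stripped.lstrip("#")), 6)
            heading = html.escape(stripped.lstrip("#").strip())
            out.append(f"<h{level}>{heading}</h{level}>")
        elif stripped.startswith("- "):
            if not in_list:
                out.append("<ul>")
                in_list = True
            out.append(f"<li>{html.escape(stripped[2:])}</li>")
        elif stripped:
            out.append(f"<p>{html.escape(stripped)}</p>")
    if in_list:
        out.append("</ul>")
    return "\n".join(out)
```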
Etheryte · 6h ago
Pure ascii text is also a subset of markdown, so it doesn't really say much that it works for a restricted subset.
uamgeoalsk · 8h ago
Sure, that's fair! In any case, I personally prefer the aesthetics and the readability of gemtext to markdown (especially when it comes to links!)
fallat · 5h ago
Uh, how much simplicity do you really gain? What's an instance of needing to backtrack?
hombre_fatal · 3h ago
You can have "[click me][1]" at the top and then "[1]: https://example.com" at the bottom. You wouldn't be able to render the link until the whole document was downloaded.
fishgoesblub · 12h ago
It was sad seeing the hate for it on here when it was new-ish, and while I haven't used it in a while, I'm glad to see it still kicking around. Such a neat and fun project.
Karrot_Kream · 11h ago
It's a fun concept, but the community around it had a strong tendency to want to proselytize their values to you. I enjoyed playing around with it in the beginning, but it introduced me to too many tech preachers, each with their own similar but slightly different philosophies that they felt I must know about.
It may have changed but that's what largely turned me off from it. I find other networking projects to have a less preachy mix of people.
_Algernon_ · 10h ago
The selection effects of people seeking out something like this are probably intense, but that was also true for the early web, and is what people liked about the early web.
rickcarlino · 3h ago
The biggest thing I’ve seen is outsiders who find protocol issues to be non-starters. It’s certainly not a perfect protocol, but neither is HTTP or email. It works and I’m happy that we have another option for hypermedia.
There were people questioning decisions in relation to goals they were projecting onto the project, but which were not actually the goals of the project. That often resembles 'hate'.
sneak · 12h ago
I don’t hate it, but I question the use. You can use HTTP to do what it does, but better.
The modern web is opt-in. I build and use sites that aren’t SPAs and shitted up with 3p resources and images and code.
HTTP is great, and deserves our time and attention. I get that they seem upset with the modern web, and I am too - but it isn’t HTTP’s fault. It’s the sites you visit.
If you want to build new and smaller communities, I really think we should be building for browsers. Perhaps a website design manifesto is in order.
treve · 11h ago
The limits of the medium and the creativity this enforces are why people like it. It caters to a niche audience with a shared set of values. I get why people don't really care for it personally or on a technical level (myself included), but it always surprises me that it's hard for people to understand that others do.
jay_kyburz · 7h ago
I agree the limitations are what makes the platform great, but I really wish they had included a simple image block in the spec.
Text only is just a little too limiting if you ask me.
rainingmonkey · 1h ago
Many browsers will render a link to an image as an embedded image block!
mattkevan · 11h ago
I’ve been exploring this problem for a while, and have been building something which I think might help solve it.
I’m currently building a browser-based static site generator that produces clean, simple code. But it does more than that.
Alongside the generated HTML, sites also publish their public configuration and source files, meaning they can be viewed in more than just a browser, for example in a CLI or accessibility device.
The client interface is also more than a CMS - you’ll be able to follow other sites, subscribe to updates, and build a network rather like a webring. The idea is to provide human-powered discovery and community tools. The reach may be less than if algorithmic, but it’s designed for genuine connection, not virality.
As the client is smart but sites are simple, sites can be hosted on anything, from the cheapest shared host up.
I’d be happy to talk further if that’s interesting in any way.
You can see an early beta of what I'm thinking about here: https://app.sparktype.org/#/sites
CJefferson · 4h ago
In terms of levels of current support, you would be hard-pressed to find anything better for accessibility than simple, well-formed HTML. It's better even than plain text.
rglullis · 10h ago
That sounds a bit like the dat browser, no?
mattkevan · 9h ago
Not familiar with dat browser, but I'll take a look.
> The modern web is opt-in. I build and use sites that aren’t SPAs and shitted up with 3p resources and images and code.
That is a microscopic subset of the modern web.
I don't use Gemini (though I am highly tempted), but I expect some of the attraction is that you can click on any link and be pretty much guaranteed not to be socked in the face with a sign-up-for-my-newsletter or cookies-accept-all-yes-later or paragraph-ad-paragraph-ad-paragraph-ad or fiddling with NoScript to find the minimum amount of Javascript that will let you read some article that looks interesting. In Gemini, all that friction just goes away.
sneak · 10h ago
You can achieve that on HTTP with a browser extension or customized browser that checks for certain tags in the page, or disables certain features altogether. It isn’t the transport’s fault.
bayindirh · 10h ago
With all respect, this viewpoint rivals the infamous Dropbox comment.
rglullis · 9h ago
Problem: you are looking for a way to get rid of the annoying issues of the modern www. What is the solution that solves this with the least amount of work?
A) Develop a whole new transport protocol that does less than HTTP, develop client applications that use this protocol, convince a sufficient number of people to use this protocol, at least to the point where the majority of your activity happens there?
or
B) Install a handful of browser extensions that block ads and other nuisances on the modern www, and have it working right away?
bayindirh · 7h ago
Option “B” implies a cat and mouse game, which you can never win.
You can’t win, by being reactive and defending all the time, a game that a mega corporation designed and implemented specifically to earn them money and protect their monopoly. Instead you have to change the game and play with your own rules.
That’s option “A”.
rglullis · 6h ago
> Instead you have to change the game and play with your own rules.
That only works if you can convince a substantial part of the participants to also play your game.
It's very easy to create an alternative internet where we can take away the power from incumbents. The hard part is creating all the activity that is taking place in the current one.
"Oh, but I can mirror the parts I want from the current internet into the new one!"
Not without playing into the same cat-and-mouse game.
bayindirh · 5h ago
Who says I'm trying to pull in everyone from the old internet to the new internet (Gemini)? If the people I care about come along, that's enough for me, and it's up to them.
For example, I switched to Mastodon, and the people I really want to follow are already there; plus I met a ton of interesting people, and was able to see real forms of people I followed before, so I have updated my views on them.
> "Oh, but I can mirror the parts I want from the current internet into the new one!"
Personally, I see Gemini or other protocols as equals to HTTP/S. For example, my blog is already mostly text and has a full-content RSS feed, so also publishing a Gemini version is not mirroring what's on the web already, just adding another medium to my blog.
If I was pumping a 3rd party site I don't own from the web to Gemini with a tool, then you'd be right, but publishing to Gemini is no different than having an RSS feed in my case.
rglullis · 4h ago
> For example, I switched to Mastodon (...) and was able to see real forms of people I followed before, so I have updated my views on them.
Isn't that strong evidence that it is possible to have a "human-scale" web built on HTTP, and consequently that there is not much benefit in restricting yourself to a protocol that is designed to be limited?
> Personally, I see Gemini or other protocols as equals to HTTP/S
Except they are not. Maybe it can do enough of the things that you care about, but Gemini is (by design!) meant to do less than HTTP.
> publishing to Gemini is no different than having an RSS feed in my case.
Again: if all you want is to be able to publish something in a simple format, then why should we care about the transport protocol?
I get the whole "the medium is the message" idea, I really do. I get that people want a simpler web and I look forward to a time where we have applications developed at a more "human scale". But I really don't get why we would have to deliberately be stripping ourselves of so much power and potential. Talking about Gemini as the best solution to the problems of the modern web is like saying we should wear chastity belts to deal with teenage pregnancies.
bayindirh · 3h ago
Yes, but it's important to understand that the limitations are moved to the Mastodon "layer" in that case. It takes careful deliberation and restraint to keep something tidy. Mastodon does this by limiting its scope and organizational structure. We as humans like to abuse capabilities. So, to keep something tidy and prevent (or realistically slow down) rot, you need a limit somewhere. Putting that limit on humans vs. the protocol is a trade-off.
In that scenario W3C doesn't put any brakes, Mastodon puts brakes on development, organizational structure and scope, and Gemini puts brakes on the protocol. So, it's the most limited but hardest to abuse in a sense.
I probably worded the "I see them as equals" part of my comment wrong. I know Gemini is a subset of HTTP, it's more Gopher than HTTP, and that's OK by me. Moreover, that leanness is something I prefer. See, the most used feature in my browser is Reader mode, and I amassed an enormous amount of links in Pocket just because of the reading experience it offered.
> I really don't get why we would have to deliberately be stripping ourselves of so much power and potential.
Because power corrupts and gets abused. A friend of mine told me that they now use Kamal, which makes deployment easy. How is it deployed? Build a container -> push to registry -> pull the container on the remote system -> run the container -> set up and run a proxy in front of that container to handle incoming connections.
That's for a simple web application...
I mean, I push files to a web server and restart its process. I'm not against power, I'm against corruption, and given human nature, restraint is something hard to practice, and that's if you want to learn and practice it.
> Talking about Gemini as the best solution to the problems of the modern web is like saying we should wear chastity belts to deal with teenage pregnancies.
I never said Gemini is the only and the best way forward. Again, for me it's another protocol, which offers a nice trade-off for some people sharing a particular set of values. It's a parallel web, like BBSes or public terminals (e.g. SDF).
Being an absolutist benefits no one. We should learn, understand and improve. We can have multiple webs, and we shall be free to roam them in a way we like. I'd love to use my terminal to roam some text only web with my favorite monospace font and terminal theme, but I like to write comments here and think on the replies I get, too.
I find myself preferring a text-only, distraction-free web more and more, and naturally evolving my habits and personal infrastructure in that way, but I'm not carrying a flag, shouting about end-times and preaching stuff as savior. I'm not that person.
rglullis · 2h ago
> it's important to understand that the limitations are moved to the Mastodon "layer" in that case.
Mastodon may be my preferred social network nowadays, but it's despite the prevalent philosophy from the development team. It's also arguably the reason that the Fediverse can not manage to grow to more than 1 million MAU.
>Because power corrupts and gets abused
The solution to this is not to get rid of power and keep everyone in the same small crab bucket. It's to make access to the powerful tools as universal and ubiquitous as possible.
> I push files to a web server and restart its process.
Your friend not being sensible enough to know when to use a tool vs when to keep it simple is not a problem of the tool. Also, talking about deployment methods seems so orthogonal to the discussion that I am not sure it makes sense to carry this conversation further.
sneak · 5h ago
Not really. You could have tinyweb/oldweb sites identify themselves with a meta tag, and have a browser that only browses those. An opt-in web-within-a-web, with js, cookies, and images turned off.
You don’t need another transport protocol.
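As a hypothetical sketch of that check (the "tinyweb" tag name and the policy are invented here for illustration, not an existing convention):

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class TinywebDetector(HTMLParser):
    """Looks for a hypothetical <meta name="tinyweb"> opt-in tag."""
    def __init__(self):
        super().__init__()
        self.is_tinyweb = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta" and dict(attrs).get("name") == "tinyweb":
            self.is_tinyweb = True

def allowed(url: str) -> bool:
    # A real browser in this mode would also refuse to run JS, set cookies or load images.
    detector = TinywebDetector()
    with urlopen(url) as response:
        detector.feed(response.read().decode("utf-8", errors="replace"))
    return detector.is_tinyweb
```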
Gormo · 2h ago
How do you stop users who aren't using the custom browser from accessing these 'tinyweb' HTTP sites? How do you prevent content scrapers and search indexers from accessing them? How do you suppress direct incorporation of 'mainstream' web content into 'tinyweb' content?
If your goal is precisely to create a parallel ecosystem that's "airgapped" from the mainstream web, and you're already going to have to develop custom clients, content formats, and server-side configuration to implement it on top of HTTP, and engage in lots of development work to imperfectly isolate the two ecosystems from each other, why wouldn't you just develop a parallel protocol and start with a clean slate?
rglullis · 1h ago
> How do you prevent content scrapers and search indexers from accessing them?
How do you do that with Gemini?
> If your goal is precisely to create a parallel ecosystem that's "airgapped" from the mainstream web
There is no way you can have an air gapped network with public access. The moment this "parallel ecosystem" showed any content that hinted at something lucrative, you will have people creating bridges between the two networks. Case in point: Google and USENET.
Gormo · 57m ago
> How do you do that with Gemini?
You keep it isolated from the ecosystem in which all of those things are taking place.
> The moment this "parallel ecosystem" showed any content that hinted at something lucrative, you will have people creating bridges between the two networks. Case in point: Google and USENET.
The whole point is to minimize the chance of that happening -- by limiting mainstream appeal, keeping it a niche, and avoiding Eternal September -- and to maximize the friction of bridging these two ecosystems. And so far, they've done a fairly good job of it, since Gemini has been expanding for six years without any indication of any of this starting to happen.
rglullis · 50m ago
> and to maximize the friction of bridging these two ecosystems.
There is no friction. It's trivial to write a program that can scrape a Gemini network.
If there is no one pulling data from Gemini servers yet, it is not because it's difficult to do, but merely because it's still too small to be relevant.
bayindirh · 5h ago
We have Kagi Small Web and Marginalia already, if that's your aim.
shakna · 7h ago
Considering "B" is becoming less possible, thanks to Google dropping Manifest 2, and going out of their way to enforce a lot more tracking, "A" looks like a lot less effort - you don't have to fight FAANG.
rglullis · 6h ago
Chrome is not the only browser out there. Firefox is still a good browser. If you depend on Chromium: Brave is keeping Manifest v2 and their ad-blocking extensions work out of the box.
shakna · 4h ago
And HTTP is not the only protocol out there. Plenty of others exist. Like Gemini, that has multiple browser implementations.
What's your point, exactly?
rglullis · 4h ago
My point is that the choice of protocol (much like the browser) is not a relevant factor if your goal is to be able to participate in the www without dealing with the issues.
We can have all the upside of an http-based web, without dealing with the downsides. The converse is not true. A Gemini network is by design limited in functionality, which is a downside that can not be mitigated.
Gormo · 3h ago
> My point is that the choice of protocol (much like the browser) is not a relevant factor if your goal is to be able to participate in the www without dealing with the issues.
Right, but that isn't the goal of Gemini. Its goal is to create a distinct ecosystem, not to participate in the existing one with marginally less annoyance.
rglullis · 2h ago
Even worse! This makes the whole proposal even more misguided.
Different ecosystems only make sense when we have distinct populations that might as well be considered different species.
Gormo · 1h ago
There's no 'proposal' here -- this is a review of an active ecosystem that has already had its ideas implemented and iterated on for the past six years.
Having a different ecosystem is the exact intention of this project. If that's not for you, you're certainly not required to participate, but the world is a vast continuum of variation, and is full of niches and clines that are intentionally distant from the global mean. Complaining that non-mainstream stuff exists seems pretty nuts to me -- the world is full of 'distinct populations'.
rglullis · 1h ago
> a vast continuum of variation, and is full of niches and clines that are intentionally distant from the global mean.
But they are all sharing the same world. It's all the same ecosystem.
My objection is not because I am against people trying to do something different. My objection is to this delusional idea that this work needs to be isolated from everyone else. It's sterile at best and elitist at worst.
fiverz · 5h ago
It's not FAANG anymore, it's GAYMMAN now
uamgeoalsk · 9h ago
What's more fun? Definitely A.
rglullis · 8h ago
You are not solving the stated problem. You are just admitting that working on a new protocol is a masturbatory, "the journey is the reward" kind of exercise.
uamgeoalsk · 1h ago
I'm not aiming to solve the stated problem, I'm having fun with gemini.
Gormo · 3h ago
The answer is "A". Perhaps some people are avoiding saying this too explicitly because it might sound a bit elitist, but I'll put how I see it as frankly as possible for the sake of clarity.
Gemini is not trying to solve a technical problem with the web. It's trying to solve a cultural problem that arises from the web having become a mass medium, in which every site's focus gradually erodes under pressure to optimize to the lowest common denominator.
Creating a new protocol from the ground up, and requiring users to install a distinct client to access it, isn't just about keeping the software aligned with the project's goals, it's about creating a sufficient threshold of thought and effort for participation that limits the audience to people who are making a deliberate decision to participate. It's about avoiding Eternal September, not about creating a parallel mass-market competitor to the web.
It's not about blocking the annoying ads, popups, and trackers, just to access sites where the content itself is full of spam, scams, political arguments, LLM slop, and other assorted nonsense, and instead creating an ecosystem that's "air-gapped" away from all that stuff, filled with thoughtful content produced deliberately by individuals. It's about collecting needles together away from the hay.
_Algernon_ · 7h ago
The benefit with A is that it also removes higher-order effects of the modern web. You may, for example, remove adverts by installing an ad blocker, but that won't change the incentives that advertising creates (e.g. clickbait, engagement maximizing, etc.). With A you can guarantee that the content is not shaped by these incentives.
rglullis · 6h ago
> With A you can guarantee that the content is not shaped by these incentives.
Without those incentives, you will quickly find out that there will not be much of an Internet out there.
If you don't believe me, check how many people are on YouTube talking about Open Source, when PeerTube exists and already can reach millions of people.
Gormo · 3h ago
> Without those incentives, you will quickly find out that there will not be much of an Internet out there.
Well, there is plenty of interesting content on Gemini. If you're OK with having 50% fewer needles in order to get rid of 99.999999% of the hay, then it's a win.
_Algernon_ · 5h ago
The internet and web existed for a long time before everything became infested with advertisement: Hobbyist bulletin boards, Wikipedia, the blogosphere, etc. These had enough content that a single person couldn't consume it all in a lifetime.
rglullis · 5h ago
That internet was also only interesting and valuable to a fraction of the people who use it today.
And if you don't care about that and you are thinking from what you might get out of it: an internet where 99% of the content is crap but universal will end up with more valuable content than a neutered internet that can prevent the emergence of crap, but is tailored to appeal only to 1% of the people.
IOW, no one cares about reading all of Wikipedia, and Wikipedia would never reach the size it has if it was something only for a handful of individuals obsessed about their particular hobbies.
_Algernon_ · 9h ago
HTTP is intermingled with a lot of the shitty SPAs, advertising and SEO of the web. You can make a simple text only site but the noise of the modern web is only ever a couple of clicks away. Gemini silos itself off. You know that a link you click will be an equally clean text-first site. To me that is the feature.
prmoustache · 7h ago
HTTP is great, but with AI crawling all over the place, maybe it is not a bad idea to publish in a safe / niche place if one is not obsessed with how many people will be reached.
I am saying this but I have no idea if AI crawlers have started to crawl gem capsules.
Gormo · 4h ago
> I don’t hate it, but I question the use. You can use HTTP to do what it does, but better.
I'm not sure I understand that. HTTP is the fundamental protocol of the web. If your goal is to create an ecosystem that is deliberately set apart from the web, how would using the same underlying tech stack help rather than hinder you in doing that?
> HTTP is great, and deserves our time and attention. I get that they seem upset with the modern web, and I am too - but it isn’t HTTP’s fault. It’s the sites you visit.
And why are those sites so awful? Did they decide to become awful from the outset, or is it because they've gradually adapted to a userbase that has regressed to the mean due to the mass-market nature of the web?
The whole point of developing a new protocol is to create a non-negligible threshold of thought and effort for participation, precisely so that it doesn't get popular quickly and end up subjected to Eternal September.
hombre_fatal · 3h ago
Though there are any number of nonstandard things you can do over HTTP to keep the unwashed Eternal September noobs from joining your community.
Requiring a markdown content-type would probably even be enough.
Consider the fact that TFA is already proxied over HTTP just so more than 3 people will read it, so it seems more sane to be HTTP native.
Gormo · 3h ago
> Though there are any number of nonstandard things you can do over HTTP to keep the unwashed Eternal September noobs from joining your community.
But why would you bother with that, when your whole goal is to create an ecosystem that's separate from the web in the first place?
> Consider the fact that TFA is already proxied over HTTP just so more than 3 people will read it. Seems more sane to be HTTP native.
Podcasts are often rehosted on YouTube, blog content is often reposted to social media, etc. Making content viewable from the other medium without making it native to the other medium is a common practice, and wouldn't defeat the purpose of trying to build a distinct ecosystem on top of the same foundation that underlies the ecosystem you're trying to avoid.
SoftTalker · 1h ago
> Podcasts are often rehosted on YouTube
I actually don't know of any other way to get them. I suspect I'm not alone. That's how pervasive the dominant platforms are.
Thanks. For those who don't want to have to click through:
Gemini is an application-level client-server internet protocol for the distribution of arbitrary files, with some special consideration for serving a lightweight hypertext format which facilitates linking between hosted files. Both the protocol and the format are deliberately limited in capabilities and scope, and the protocol is technically conservative, being built on mature, standardised, familiar, "off-the-shelf" technologies like URIs, MIME media types and TLS. Simplicity and finite scope are very intentional design decisions motivated by placing a high priority on user autonomy, user privacy, ease of implementation in diverse computing environments, and defensive non-extensibility. In short, it is something like a radically stripped down web stack. See section 4 of this FAQ document for questions relating to the design of Gemini.
monkeywork · 13h ago
Gemini is one of those things I see pop up and tell myself I should look into it more and then it fades into the back of my mind.
Anyone have any hints on getting more use out of it, or ways to make it more present in my day-to-day?
b00ty4breakfast · 13h ago
NewsWaffle gemini://gemi.dev/cgi-bin/waffle.cgi/
takes a url to a regular webpage and spits out a gemtext version that is much more sparse and, for me, is much more readable.
it's honestly the only reason I still use gemini, since the rest of it is abandoned gemlogs, rehosts of web content I don't care about, or ersatz social media
akkartik · 12h ago
Oh nice, a gateway in the opposite direction!
anthk · 4h ago
There are far more gopher phlogs than gemini gemlogs.
Still, both communities overlap of course.
Setting up a gopher phlog requires no TLS at all, and any machine from 1980 (maybe even ITS with a Gopher client written in MacLisp) will be able to read it with no issues.
rainingmonkey · 48m ago
Kristall[1] is my favourite browser, Antenna[2] is my favourite aggregator.
I made the first (and still popular) "social network" 4 years ago. Still going strong. More info: https://martinrue.com/station
tvshtr · 12h ago
Is this federated with anything?
dimkr1 · 8h ago
https://github.com/dimkr/tootik is another Gemini social network that does federate over ActivityPub, and I've been thinking about developing a minimalist ActivityPub alternative (maybe using Gemini and Titan to replace HTTP GET and POST, respectively) that can coexist with ActivityPub support
kstrauser · 2h ago
That’s brilliant, and exactly the sort of thing that could get me back into Gemini. Nice!
martinrue · 12h ago
No, it's as simple as it sounds. I should think about that at some point.
agumonkey · 13h ago
I love the frugality and the overall project goals, and I'm used to non-mainstream ergonomics (I enjoy gnu ed..). But something about Gemini search engines and browsing just didn't work for me, and I stopped using it.
You will need to prefix a gemini URL with "gem " if you're pasting it into the address bar.
debo_ · 3h ago
I like Gemini. My (anonymous) blog (gemlog) is posted there and has an http proxy, and is a no-brainer to maintain.
Once in a while I check Lagrange (Gemini "browser") for gemlogs I've subscribed to and catch up with what other anons are going through. It tends to be a lot more raw and persona-less than what I find on the web, which I appreciate. It's generally just cozy.
martinrue · 13h ago
Gemini for me has been such a breath of fresh air in contrast to 2025 Internet, with so many ads, grift and now AI slop.
Back when I first discovered Gemini, I wanted to create a space for people to have a voice without needing to run their own server, so I built Station (https://martinrue.com/station). I've been running it ever since.
Gemini in general, and specifically the folk who use Station regularly, give it a friendly, throwback-to-the-90s vibe, and I still value it a lot.
Wild_Dolphins · 8h ago
Gemini is a beautiful idea.
However, it works on the basis of mandatory-prohibition. The prohibition is: "You cannot track and exploit your site visitors". This philosophy is enforced 'remotely', by the creators of the Gemini protocol.
An identical end-result can be achieved in HTML, by choosing not to use hostile markup. However, with HTML the prohibition must be enforced 'locally', by the ethical-philosophical position of the website-designer.
The problem with the Gemini-protocol is that it introduces an attack vector: The Gemini 'browsers' themselves. The most popular one is not audited; has a huge code-base; and has relatively few eyes-on-it.
I'm not saying that Gemini protocol is a honey-trap for those trying to exit the surveillance-internet; but if I was a tech-giant / agency profiting from the surveillance-internet, I would definitely write browsers for the Gemini protocol and backdoor them.
As a former "Don't be evil" company, it would be of great interest to me to know who was trying to exit my 'web'; how; and why :)
Food for thought...
zozbot234 · 6h ago
Gemini the protocol is still a bit mysterious to me. Why not use plain HTTPS and just serve a new text/x.gemini MIME type? Or even serve plain old text/html and enforce no-JS-or-CSS in the Gemini client.
mcluck · 3h ago
Part of the goal of making it so tiny, as far as I understand it, is that a normal person could reasonably implement the entire thing, from server to client. Going full HTTPS and HTML is a bit of a lift for a single person in a short period of time.
Gormo · 2h ago
Because then it would be using HTTPS, and would not be isolated from the web in the way Gemini intends.
thisisauserid · 6h ago
"1.1.1 The dense, jargony answer for geeks in a hurry
Gemini is an application-level client-server internet protocol for the distribution of arbitrary files"
If I were one, I would consider that to have been buried.
vouaobrasil · 10h ago
I love the idea, but it's just too fringe for me to use. But I will say that I think the internet was far better before Google search really became strong, and before the corresponding massive increase in SEO spam.
jmclnx · 3h ago
FWIW, I have moved my WEB site to Gemini and decided a little later to mirror it on gopher (for fun).
I find maintaining these 2 sites far easier than dealing with html and the *panels I need to use to upload to my old WEB site.
People who have never viewed Gemini are missing some decent content.
grep_name · 3h ago
I checked out gemini maybe four years ago? And I remember really liking the idea but struggling to find content. Got any tips?
Gormo · 2h ago
gemini://gemini.circumlunar.space/capcom/
gemini://skyjake.fi/~Cosmos/
jmclnx · 3h ago
try gemini://sdf.org/ and gemini://gem.sdf.org/
LAC-Tech · 12h ago
Does gemini have any concept of feeds or pub/sub?
I noted there were a few capsules that acted as a sort of hub for other people's capsules, which suggested to me there was a way to automate it, and that I might be able to make my own.
These are some feeds and aggregators I have bookmarked
gemini://skyjake.fi/~Cosmos/view.gmi
gemini://warmedal.se/~antenna/
gemini://calcuode.com/gmisub-aggregate.gmi
gemini://gemini.circumlunar.space/capcom/
gemini://tinylogs.gmi.bacardi55.io/
gemini://sl1200.dystopic.world/juntaletras.gmi
snvzz · 9h ago
This is just somebody's "finished" pet protocol (author did not allow anybody to give input). Narcissism we should not enable.
I will stick to gopher, as it is mature and much friendlier to low spec / retro machines.
spc476 · 8h ago
The author very much did allow others to give input. The original protocol had single-digit status codes; I was arguing for three-digit codes; he compromised with two-digit codes. It was my idea to include the full URL for requests, and for redirections. It's just that it wasn't easy, but he could be reasoned with. The only two hard lines Solderpunk had for the protocol were TLS and single-level lists (why, I don't know).
anthk · 4h ago
Gopher user here, from texto-plano (and, rarely, SDF).
Gopher often sucks for 40x25 devices or mobile ones. Yes, word wrapping, but everyone uses the 72-char limit or doesn't give a heck, and I have to set my own $PAGER calling fmt, fold or par before less.
On TLS, you are right. But I've got to build BearSSL and some LibreSSL for Damn Small Linux: the 2.4-kernel one, where ALSA was a novelty and DMIX was hard to set up, the one you got with Debian Woody... with the bf24 at the LILO prompt.
So, if DSL can run some BearSSL-based OpenSSL-lite client, a gemini client for it should be totally doable.
uamgeoalsk · 9h ago
That's not narcissism - it's just someone building something they enjoy and sharing it with the world. Do you have the same objections to fiction writers or songwriters?
It's totally fine to prefer gopher for its maturity (I'd vehemently disagree, but that's for another day) or compatibility with retro machines, but framing someone else's creative project as a character flaw is just rude.
ilaksh · 11h ago
I think Gemini is a step in the right direction for some things. I usually mention my "tersenet" ideas when I see Gemini. Now that we have the Gemini LLM and Claude etc., there is less excuse for me not to finish a real software demo for it. Maybe one of these days I will make an actual client for part of it.
I think there is room for things like media and applications even on an idealized web. But they should not necessarily be combined along with information browsing and search into one thing.
Like many have mentioned already, I personally would have preferred pure markdown and no gemtext at all. Similarly, and although I understand the reasoning behind making encryption mandatory, I believe it should be optional in the spirit of KISS. I'm more of a minimalist than I am a privacy evangelist. In this regard, I felt a bit out of place within the gemini community.
Finally, the argument that it takes a new protocol to avoid a broken user experience, often exemplified by someone jumping from a simple and well behaved HTTP website into a chaotic one, doesn't resonate much with me. Again, I get it, but I can live with visiting only the websites or gopherholes I want. This comes with a great advantage. Even if we consider just the minimalist and well designed websites, this means hoards of content when compared to all gemini capsules. I missed a broader set of topics when I used gemini and ultimately that was what killed my interest.
All that said, I loved it while I used it and I stumbled upon some really nice people. Maybe I'll fall in love again one day...
gluon
This is definitely Gemini's biggest weakness. I looked around on it a bit when it was gaining attention, and most of the sites I saw were just complaints about how bloated the modern web had become. I get it, but it's kind of treating the whole thing as a novelty rather than an actual medium that can be used to convey information. It didn't have the wide and varied userbase that even the mid-90s academic web they were trying to replicate had. It kind of reminded me of all the people who write a static site generator for their blog, and then only write a single blogpost about how they made their static site generator.
I agree. Personally, I'm a fan of progressive enhancement.
E.g. I use this Hugo partial to hide emails; it de-obfuscates an address using JavaScript, and falls back to printing a shell command:
(Hopefully HN will preserve all the characters.)Similar for CSS, although that one is a forever WIP...
[1] https://superuser.com/a/235965
You can of course recreate this experience using HTTP and modern browsers, but both are so complicated that you don't know what's really happening without a lot of work.
Interesting note: the first line of a Gemini response is a MIME type. It's usually `text/gemini` but there's no reason it can't be `text/html`, `application/javascript`, or anything else. A while back I did a little poking in some Gemini server code and made it do precisely that: serve HTML files which I accessed via elinks. Of course once you're serving HTML over Gemini you might ask, exactly what advantage am I getting by putting it over a purposefully-broken subset of HTTP, and I would say that's a damn good question.
In 2024 I wrote 'The modern Web and all its crappiness didn't come about because there's something inherently wicked in HTML and HTTP, it came about because people built things on top of the basic foundation, extending (sometimes poorly) and expanding. The more people play with Gemini, the more they'll want to "extend" it... and the closer they'll bring it to HTTP, because it follows the exact same fundamental model once you strip off the extraneous document format specification' and I stand by it.
It's very fun to develop for. The simplicity of the protocol means that writing a server, client or "web app" (equivalent) is a weekend project. So there is a proliferation of software for it but that doesn't necessarily translate into content.
There is content, though. My favourite aggregator is gemini://warmedal.se/~antenna/ and I do still drop by there regularly enough to have a browse. It's no longer all meta content which is good (people used to just use Geminispace to write about Gemini). It's still quite tech/FOSS focused, unsurprisingly.
I agree with the other comments that are saying that a simple markdown would have been better than gemtext.
Whenever Gemini gets mentioned on HN there are a lot of commenters who seem to have an issue with the "views" or "values" of some people within the community. They never go into detail. I can honestly say I'm not sure what the issue is. As a very middle-of-the-road centrist I have never had much of an issue with the content I find on Gemini. Sure, you had a few interesting "characters" on the mailing list (back when it existed) but they were a minority and it was nothing you don't also find on the web. I guess people there tend to be more dogmatic about sticking to FOSS and keeping the internet non-corporate, which can rub people the wrong way, but again you can find similar views on the web (and IMO it makes for interesting discussions even if I don't agree with the dogmatism).
I think Gemini is great, and read from Nyxt browser. Don't know if I've seen any references to privacy benefits, so curious.
https://github.com/rcarmo/aiogemini
A key issue with the ecosystem (not the protocol) as far as I’m concerned is that it would have been stupendously better to settle on Markdown (even a simplified form) for content creation. The rest is OK, I guess, but it’s just a nuisance to maintain “dual format” sites.
(I see a few comments here about the community’s opinions and leanings, but to be honest it’s not any weirder than your average old-timely IRC channel or fringe Mastodon server—-live and let live, read what you want and just skip Antenna listings you don’t like)
It may have changed but that's what largely turned me off from it. I find other networking projects to have a less preachy mix of people.
The modern web is opt-in. I build and use sites that aren’t SPAs and shitted up with 3p resources and images and code.
HTTP is great, and deserves our time and attention. I get that they seem upset with the modern web, and I am too - but it isn’t HTTP’s fault. It’s the sites you visit.
If you want to build new and smaller communities, I really think we should be building for browsers. Perhaps a website design manifesto is in order.
Text only is just a little to limiting if you ask me.
I’m currently building a browser-based static site generator that produces clean, simple code. But it does more than that.
Alongside the generated HTML, sites also publish their public configuration and source files, meaning they can be viewed in more than just a browser, for example in a CLI or accessibility device.
The client interface is also more than a CMS - you’ll be able to follow other sites, subscribing to updates, and build a network rather like a webring. The idea is to provide human-powered discovery and community tools. The reach may be less than if algorithmic, but it’s designed for genuine connection, not virality.
As the client is smart but sites are simple, sites can be hosted on anything, from the cheapest shared host up.
I’d be happy to talk further if that’s interesting in any way.
You can see an early beta of what I'm thinking about here: https://app.sparktype.org/#/sites
That is a microscopic subset of the modern web.
I don't use Gemini— though I am highly tempted —but I expect some of the attraction is that you can click on any link and pretty much guaranteed not to be socked in the face with a sign-up-for-my-newsletter or cookies-accept-all-yes-later or paragraph-ad-paragraph-ad-paragraph-ad or fiddling with NoScript to find the minimum amount of Javascript that will let you read some article that looks interesting. In Gemini, all that friction just goes away.
A) Develop a whole new transport protocol that does less than HTTP, develop client applications that use this protocol, convince a sufficient number of people to use this protocol, at least to the point where the majority of your activity happens there?
or
B) Install a handful of browser extensions that block ads and other nuisances on the modern www, and have it working right away?
You can’t win a game designed and implemented by a mega corporation which is specially made to earn them money and protect their monopoly by being reactive and defending all the time. Instead you have to change the game and play with your own rules.
That’s option “A”.
That only works if you can convince the a substantial part of the participants to also play your game.
It's very easy to create an alternative internet where we can take away the power from incumbents. The hard part is creating all the activity that is taking place in the current one.
"Oh, but I can mirror the parts I want from the current internet into the new one!"
Not without playing into the same cat-and-mouse game.
For example, I switched to Mastodon, and follow people who I really want to follow are already there, plus I met a ton of interesting people, and was able to see real forms of people I followed before, so I have updated my views on them.
> "Oh, but I can mirror the parts I want from the current internet into the new one!"
Personally, I see Gemini or other protocols as equals to HTTP/S. For example, my blog is already text in most cases, has a full content RSS feed, so, also publishing a Gemini version is not mirroring what's on the web already, just adding another medium to my blog.
If I was pumping a 3rd party site I don't own from web to Gemini with a tool, then you'd be right, but publishing to Gemini is not different than having a RSS feed in my case.
Isn't that strong evidence that it is possible to have a "human-scale" web built on HTTP, and consequently that there is not much benefit in restricting yourself to a protocol that is designed to be limited?
> Personally, I see Gemini or other protocols as equals to HTTP/S
Except they are not. Maybe it can do enough of the things that you care about, but Gemini is (by design!) meant to do less than HTTP.
> publishing to Gemini is not different than having a RSS feed in my case.
Again: if all you want is to be able to publish something in a simple format, then why should we care about the transport protocol?
I get the whole "the medium is the message" idea, I really do. I get that people want a simpler web and I look forward to a time where we have applications developed at a more "human scale". But I really don't get why we would have to deliberately be stripping ourselves of so much power and potential. Talking about Gemini as the best solution to the problems of the modern web is like saying we should wear chastity belts to deal with teenage pregnancies.
In that scenario W3C doesn't put any brakes, Mastodon puts brakes on development, organizational structure and scope, and Gemini puts brakes on the protocol. So, it's the most limited but hardest to abuse in a sense.
I probably worded my "I see them as equals" part of my comment wrong. I know Gemnini is a subset of HTTP, it's more Gopher than HTTP, and that's OK by me. Moreover, that leanness is something I prefer. See, the most used feature on my browser is Reader mode, and I amassed enormous amount of links in Pocket just because of the reading experience it offered.
> I really don't get why we would have to deliberately be stripping ourselves of so much power and potential.
Because power corrupts and gets abused. A friend of mine told me that they now use Kamal which makes deployment easy. How it's deployed? Build a container -> Push to registry -> pull the container on the remote system -> runs the container -> sets up and runs a proxy in front of that container to handle incoming connections.
That's for a simple web application...
I mean, I push files to a web server and restart its process. I'm not against power, I'm against corruption, and given human nature, restraint is something hard to practice, and that's if you want to learn and practice it.
> Talking about Gemini as the best solution to the problems of the modern web is like saying we should wear chastity belts to deal with teenage pregnancies.
I never said Gemini is the only and the best way forward. Again, for me It's another protocol, which offers a nice trade-off for some people sharing a particular set of values. It's like a parallel web like BBSes or public terminals (e.g.: SDF).
Being an absolutist benefits no one. We should learn, understand and improve. We can have multiple webs, and we shall be free to roam them in a way we like. I'd love to use my terminal to roam some text only web with my favorite monospace font and terminal theme, but I like to write comments here and think on the replies I get, too.
I find myself preferring a text-only, distraction-free web more and more, and naturally evolving my habits and personal infrastructure in that way, but I'm not carrying a flag, shouting about end-times and preaching stuff as savior. I'm not that person.
Mastodon may be my preferred social network nowadays, but it's despite the prevalent philosophy from the development team. It's also arguably the reason that the Fediverse can not manage to grow to more than 1 million MAU.
>Because power corrupts and gets abused
The solution to this is not to get rid of power and keep everyone in the same small crab bucket. It's to make access to the powerful tools as universal and ubiquitous as possible.
> I push files to a web server and restart its process.
Your friend not being sensible enough to know when to use a tool vs when to keep it simple is not a problem of the tool. Also, talking about deployment methods seems so orthogonal to the discussion that I am not sure it makes sense to carry this conversation further.
You don’t need another transport protocol.
If your goal is precisely to create a parallel ecosystem that's "airgapped" from the mainstream web, and you're already going to have to develop custom clients, content formats, and server-side configuration to implement it on top of HTTP, and engage in lots of development work to imperfectly isolate the two ecosystems from each other, why wouldn't you just develop a parallel protocol and start with a clean slate?
How do you do that with Gemini?
> If your goal is precisely to create a parallel ecosystem that's "airgapped" from the mainstream web
There is no way you can have an air-gapped network with public access. The moment this "parallel ecosystem" shows any content that hints at something lucrative, you will have people creating bridges between the two networks. Case in point: Google and USENET.
You keep it isolated from the ecosystem in which all of those things are taking place.
> The moment this "parallel ecosystem" shows any content that hints at something lucrative, you will have people creating bridges between the two networks. Case in point: Google and USENET.
The whole point is to minimize the chance of that happening -- by limiting mainstream appeal, keeping it a niche, and avoiding Eternal September -- and to maximize the friction of bridging these two ecosystems. And so far, they've done a fairly good job of it, since Gemini has been expanding for six years without any indication of any of this starting to happen.
There is no friction. It's trivial to write a program that can scrape a Gemini network.
If no one is pulling data from Gemini servers yet, it's not because it's difficult to do, but merely because the network is still too small to be relevant.
What's your point, exactly?
We can have all the upside of an HTTP-based web without dealing with the downsides. The converse is not true. A Gemini network is limited in functionality by design, which is a downside that cannot be mitigated.
Right, but that isn't the goal of Gemini. Its goal is to create a distinct ecosystem, not to participate in the existing one with marginally less annoyance.
Different ecosystems only make sense when we have distinct populations that might as well be considered different species.
Having a different ecosystem is the exact intention of this project. If that's not for you, you're certainly not required to participate, but the world is a vast continuum of variation, and is full of niches and clines that are intentionally distant from the global mean. Complaining that non-mainstream stuff exists seems pretty nuts to me -- the world is full of 'distinct populations'.
But they are all sharing the same world. It's all the same ecosystem.
My objection is not because I am against people trying to do something different. My objection is to this delusional idea that this work needs to be isolated from everyone else. It's sterile at best and elitist at worst.
Gemini is not trying to solve a technical problem with the web. It's trying to solve a cultural problem that arises from the web having become a mass medium, in which every site's focus gradually erodes under pressure to optimize for the lowest common denominator.
Creating a new protocol from the ground up, and requiring users to install a distinct client to access it, isn't just about keeping the software aligned with the project's goals, it's about creating a sufficient threshold of thought and effort for participation that limits the audience to people who are making a deliberate decision to participate. It's about avoiding Eternal September, not about creating a parallel mass-market competitor to the web.
It's not about blocking the annoying ads, popups, and trackers, just to access sites where the content itself is full of spam, scams, political arguments, LLM slop, and other assorted nonsense, and instead creating an ecosystem that's "air-gapped" away from all that stuff, filled with thoughtful content produced deliberately by individuals. It's about collecting needles together away from the hay.
Without those incentives, you will quickly find out that there will not be much of an Internet out there.
If you don't believe me, check how many people are on YouTube talking about Open Source, when PeerTube exists and can already reach millions of people.
Well, there is plenty of interesting content on Gemini. If you're OK with having 50% fewer needles in order to get rid of 99.999999% of the hay, then it's a win.
And if you don't care about that and are thinking about what you might get out of it: an internet where 99% of the content is crap, but which is universal, will end up with more valuable content than a neutered internet that can prevent the emergence of crap but is tailored to appeal to only 1% of the people.
IOW, no one cares about reading all of Wikipedia, and Wikipedia would never have reached the size it has if it were something only for a handful of individuals obsessed with their particular hobbies.
I'm saying this, but I have no idea whether AI crawlers have started to crawl Gemini capsules.
I'm not sure I understand that. HTTP is the fundamental protocol of the web. If your goal is to create an ecosystem that is deliberately set apart from the web, how would using the same underlying tech stack help rather than hinder you in doing that?
> HTTP is great, and deserves our time and attention. I get that they seem upset with the modern web, and I am too - but it isn’t HTTP’s fault. It’s the sites you visit.
And why are those sites so awful? Did they decide to become awful from the outset, or is it because they've gradually adapted to a userbase that has regressed to the mean due to the mass-market nature of the web?
The whole point of developing a new protocol is to create a non-negligible threshold of thought and effort for participation, precisely so that it doesn't get popular quickly and end up subjected to Eternal September.
Requiring a markdown content-type would probably even be enough.
Consider the fact that TFA is already proxied over HTTP just so more than 3 people will read it, so it seems more sane to be HTTP native.
But why would you bother with that, when your whole goal is to create an ecosystem that's separate from the web in the first place?
> Consider the fact that TFA is already proxied over HTTP just so more than 3 people will read it. Seems more sane to be HTTP native.
Podcasts are often rehosted on YouTube, blog content is often reposted to social media, etc. Making content viewable from the other medium without making it native to the other medium is a common practice, and wouldn't defeat the purpose of trying to build a distinct ecosystem on top of the same foundation that underlies the ecosystem you're trying to avoid.
I actually don't know of any other way to get them. I suspect I'm not alone. That's how pervasive the dominant platforms are.
Gemini is an application-level client-server internet protocol for the distribution of arbitrary files, with some special consideration for serving a lightweight hypertext format which facilitates linking between hosted files. Both the protocol and the format are deliberately limited in capabilities and scope, and the protocol is technically conservative, being built on mature, standardised, familiar, "off-the-shelf" technologies like URIs, MIME media types and TLS. Simplicity and finite scope are very intentional design decisions motivated by placing a high priority on user autonomy, user privacy, ease of implementation in diverse computing environments, and defensive non-extensibility. In short, it is something like a radically stripped down web stack. See section 4 of this FAQ document for questions relating to the design of Gemini.
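On the wire, that radically stripped down stack amounts to a single TLS request followed by a one-line response header. A rough sketch in Python (the host is a placeholder; a real client also handles redirects, client certificates, and trust-on-first-use certificate pinning):

    # gemini_fetch.py - a bare-bones Gemini request: open a TLS connection,
    # send the URL plus CRLF, read back a "<status> <meta>" header and the body.
    import socket, ssl

    def gemini_fetch(host, path="/", port=1965):
        ctx = ssl.create_default_context()
        ctx.check_hostname = False       # most capsules use self-signed certs;
        ctx.verify_mode = ssl.CERT_NONE  # real clients pin them (TOFU) instead
        with socket.create_connection((host, port)) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                tls.sendall(f"gemini://{host}{path}\r\n".encode("utf-8"))
                data = b""
                while chunk := tls.recv(4096):
                    data += chunk
        header, _, body = data.partition(b"\r\n")
        return header.decode("utf-8"), body  # e.g. ("20 text/gemini", b"# Hello ...")

    # print(gemini_fetch("geminiprotocol.net")[0])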
Anyone have any hints on getting more use out of it, or ways to make it more present in my day-to-day?
takes a URL to a regular webpage and spits out a gemtext version that is much more sparse and, for me, much more readable.
For example, here's this very website:
gemini://gemi.dev/cgi-bin/waffle.cgi/feed?https%3A%2F%2Fnews.ycombinator.com%2Frss
It's honestly the only reason I still use Gemini, since the rest of it is abandoned gemlogs, rehosts of web content I don't care about, or ersatz social media.
Still, both communities overlap of course.
Setting up a gopher phlog requires no TLS at all, and any machine from 1980 (maybe even ITS with a Gopher client written in MacLisp) will be able to read it with no issues.
Get a browser, read some capsules!
[1] https://kristall.random-projects.net/ [2] gemini://warmedal.se/~antenna/
You will need to prefix a gemini URL with "gem " if you're pasting it into the address bar.
Once in a while I check Lagrange (Gemini "browser") for gemlogs I've subscribed to and catch up with what other anons are going through. It tends to be a lot more raw and persona-less than what I find on the web, which I appreciate. It's generally just cozy.
Back when I first discovered Gemini, I wanted to create a space for people to have a voice without needing to run their own server, so I built Station (https://martinrue.com/station). I've been running it ever since.
Gemini in general, and the folks who use Station regularly in particular, give it a friendly, throwback-to-the-90s vibe, and I still value it a lot.
However, it works on the basis of mandatory prohibition. The prohibition is: "You cannot track and exploit your site visitors". This philosophy is enforced 'remotely', by the creators of the Gemini protocol.
An identical end result can be achieved in HTML by choosing not to use hostile markup. However, with HTML the prohibition must be enforced 'locally', by the ethical-philosophical position of the website designer.
The problem with the Gemini protocol is that it introduces an attack vector: the Gemini 'browsers' themselves. The most popular one is not audited, has a huge code base, and has relatively few eyes on it.
I'm not saying the Gemini protocol is a honey trap for those trying to exit the surveillance internet; but if I were a tech giant or an agency profiting from the surveillance internet, I would definitely write browsers for the Gemini protocol and backdoor them.
If I were a former "Don't be evil" company, it would be of great interest to me who was trying to exit my 'web', how, and why :)
Food for thought...
"Gemini is an application-level client-server internet protocol for the distribution of arbitrary files"
If I were one, I would consider that to have been buried.
I find maintaining these 2 sites far easier than dealing with HTML and the *panels I need to use to upload to my old web site.
People who have never viewed Gemini are missing some decent content.
gemini://skyjake.fi/~Cosmos/
I noted there were a few capsules that acted as a sort of hub for other people's capsules, which suggested to me that there was a way to automate it and that I might be able to make my own.
Yes: https://geminiprotocol.net/docs/companion/subscription.gmi
Many clients, including my favorite, Lagrange (https://gmi.skyjake.fi/lagrange/) support feed subscriptions.
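For anyone wondering what those subscriptions actually are: as I read the companion spec linked above, any gemtext page doubles as a feed, and each "=>" link whose label starts with an ISO date counts as an entry. A simplified sketch of that convention (the real spec also takes the feed title from the page's first heading):

    # gemsub_sketch.py - collect dated "=>" link lines from a gemtext page,
    # roughly following the gemsub convention: a link whose label begins
    # with YYYY-MM-DD is treated as a feed entry.
    import re

    ENTRY = re.compile(r"^=>\s*(\S+)\s+(\d{4}-\d{2}-\d{2})\s*-?\s*(.*)$")

    def parse_feed(gemtext: str):
        entries = []
        for line in gemtext.splitlines():
            m = ENTRY.match(line)
            if m:
                url, date, title = m.groups()
                entries.append((date, title or url, url))
        return entries

    sample_lines = [
        "# Example gemlog",
        "=> second.gmi 2024-05-02 - Second post",
        "=> first.gmi  2024-05-01 - First post",
    ]
    print(parse_feed("\n".join(sample_lines)))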
1. https://tildegit.org/sloum/spacewalk
2. https://github.com/kensanata/moku-pona
gemini://skyjake.fi/~Cosmos/view.gmi
gemini://warmedal.se/~antenna/
gemini://calcuode.com/gmisub-aggregate.gmi
gemini://gemini.circumlunar.space/capcom/
gemini://tinylogs.gmi.bacardi55.io/
gemini://sl1200.dystopic.world/juntaletras.gmi
I will stick to gopher, as it is mature and much friendlier to low spec / retro machines.
Gopher often sucks on 40x25 devices or mobile ones. Yes, word wrapping exists, but everyone hard-wraps at the 72-character limit, or doesn't give a heck at all, and I have to set my own $PAGER that calls fmt, fold, or par before less.
On TLS, you are right. But I've got to build BearSSL and some LibreSSL for Damn Small Linux. The 2.4-kernel one, where ALSA was a novelty and DMIX was hard to set up, the one you got with Debian Woody... with bf24 at the LILO prompt.
So, if DSL can run some BearSSL-based OpenSSL-lite client, a Gemini client for it should be totally doable.
It's totally fine to prefer gopher for its maturity (I'd vehemently disagree, but that's for another day) or compatibility with retro machines, but framing someone else's creative project as a character flaw is just rude.
I think there is room for things like media and applications even on an idealized web. But they should not necessarily be combined along with information browsing and search into one thing.
https://github.com/runvnc/tersenet