Show HN: I Built ImgFiber, a Better Image Optimizer. Free, No Limits

40 points by iambavith | 39 comments | 4/6/2025, 6:55:55 AM | imgfiber.com
No file size or upload limits. Everything is processed locally, right in your browser. No server uploads.

NOT an FFmpeg wrapper.

It's not like the other alternatives - try it for yourself!

Are you someone who deals with lots of images and always finds your storage full? Or someone who works on website optimization and would love faster loading speeds? No matter who you are, as long as you work with images, ImgFiber has your back!

⬇ Reduce image file sizes by up to 95% without losing quality!
Supports all major formats: JPG, PNG, JPEG, GIF, SVG, and WebP.
Works entirely in your browser - no uploads, no servers, just the power of your own device!
Blazing fast compression - processes images as quickly as you drag and drop.
Delivers results 2x better than competitors like OptImage, CompressX, TinyPNG, and Squoosh.
Totally FREE with no file size or count limits - compress as many images as you want!

Social proof? I'm going to be honest with you: I don't have any. I've been so busy building tools like ImgFiber, Codeaway, and QuickWrap all year that I forgot they need actual users to have any real value. I've spent a year building a range of SaaS products with zero exposure, so I can't offer you a "trusted by 40,000+ users around the world" badge. Nah, I don't have that - that's why I'm here!

I would genuinely love for you to give ImgFiber a shot!

Check it out at ImgFiber.com and let me know your thoughts!

Comments (39)

bertman · 24d ago
iambavith · 24d ago
I built the image compression for ImgFiber using CompressorJS as the base. It's a reliable library by Chen Fengyuan (https://fengyuanchen.github.io/compressorjs/), originally designed for lossy compression. I wrapped and modified it to meet our specific needs, tweaking the code so the results differ from the defaults - you won't get the same file size or optimization using it out of the box. Full credit for the core library goes to the original author; my work was to adapt it to our use case, to solve our own personal pain point.
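Roughly, the wrapper pattern looks like the sketch below - the preset values here are just placeholders, not our production settings:

  import Compressor from 'compressorjs';

  // Placeholder preset - illustrative only, not ImgFiber's shipped values.
  const PRESET = { quality: 0.75, mimeType: 'auto' };

  function compress(file, overrides = {}) {
    return new Promise((resolve, reject) => {
      new Compressor(file, {
        ...PRESET,
        ...overrides,
        success: resolve,  // resolves with a Blob/File of the re-encoded image
        error: reject,
      });
    });
  }

  // Usage: const smaller = await compress(droppedFile);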
bertman · 24d ago
Using CompressorJS means lossy image compression. You really shouldn't be calling this "Instant, Lossless File Compression" on your page.

Also, because you're using CompressorJS's default settings, you should know that PNG files > 5 MB are converted to JPG (https://github.com/fengyuanchen/compressorjs?tab=readme-ov-f...). Your site, however, keeps the .png file extension on the converted image, again falsely suggesting lossless compression.
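You can check this yourself with CompressorJS's defaults - a quick sketch, assuming `largePng` is a PNG File over the 5 MB threshold:

  import Compressor from 'compressorjs';

  // largePng: assumed to be a PNG File/Blob larger than the default 5 MB convertSize.
  new Compressor(largePng, {
    success(result) {
      // With the default convertTypes (['image/png']) and convertSize (5000000),
      // this logs "image/jpeg" - regardless of the .png name the download keeps.
      console.log(result.type);
    },
    error(err) {
      console.error(err.message);
    },
  });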

mootothemax · 24d ago
> I wrapped and modified it to meet our specific needs, tweaking the code so the results differ from the default - you won’t get the same file size or optimization when using it out of the box.

I presume this is an oversight; changing CompressorJS's quality setting from its default of 0.8 to 0.75 results in the exact same output as ImgFiber.

Full settings dump:

  {
    "strict": true,
    "checkOrientation": true,
    "retainExif": false,
    "maxWidth": null,
    "maxHeight": null,
    "minWidth": 0,
    "minHeight": 0,
    "resize": "none",
    "quality": 0.75,
    "mimeType": "auto",
    "convertTypes": ["image/png"],
    "convertSize": 5000000,
    "beforeDraw": null,
    "drew": null
  }
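To reproduce the comparison, that options object can be passed straight to CompressorJS (adding the success/error callbacks, which aren't part of the dump). A sketch, assuming `file` is the dropped image; entries left at their null/0 defaults are omitted:

  import Compressor from 'compressorjs';

  const options = {
    strict: true,
    checkOrientation: true,
    retainExif: false,
    quality: 0.75,                // the only change from CompressorJS's 0.8 default
    mimeType: 'auto',
    convertTypes: ['image/png'],
    convertSize: 5000000,
  };

  new Compressor(file, {
    ...options,
    success(result) { console.log(result.size, result.type); },
    error(err) { console.error(err.message); },
  });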
iambavith · 23d ago
the settings dump you dropped will not deliver the same output :)
RamblingCTO · 24d ago
Isn't everything a wrapper for something in the end? Weird comment ...
yjftsjthsd-h · 24d ago
Is CompressorJS a wrapper?
RamblingCTO · 24d ago
I mean software products. Everything commoditizes something in the end, imho.
bertman · 24d ago
Yeah, but if you're trying to promote your AI-generated React bloat wrapper in a "Show HN", you should at least be honest and describe how it actually works instead of writing mindless, incoherent, intentionally misleading ad blurbs without any actual information, imho.
iambavith · 23d ago
It's not an AI-generated React bloat wrapper, as you somehow seem to think.

We did not lie about how it actually works when fellow developers demanded answers. I love talking about what I have built. Maybe just nudge better, instead of hating?

RamblingCTO · 23d ago
Welcome to HN ;) Lots of naysayers that have never built a thing. Keep at it and don't worry.
CyberDildonics · 23d ago
What is it that you think they did? They have large claims and can't answer a single technical question. They don't even seem to know the difference between lossless and lossy compression or a png and a jpeg.
RamblingCTO · 22d ago
But that's my point: you've got to learn to build products, and the hate and criticism is sometimes really unbelievable. You're exposing yourself to the world. And in the end you pay for a service if it adds value, no matter whether it wraps CompressorJS. And sometimes that value takes a bit of iteration. Instead of using positive reinforcement, the comments feel pretty hateful at times. HN is often not a supportive environment, but it would be awesome if it were, right?
CyberDildonics · 22d ago
Did you read through this thread? What you are saying is completely detached from what they are saying. They are claiming they made a new product/service and that it beats all sorts of competitors. They aren't claiming they are learning or doing something for the first time.

Then people say that they can't get any results from it and they can't answer basic questions. They even put jpg and jpeg as two different formats.

People are being exceptionally nice in this thread to someone who probably copy-pasted code-generation output to try to get free advertising and make a quick buck.

The comments aren't 'hateful'; they are asking the most basic technical questions and pointing out that none of these claims are actually true.

CyberDildonics · 23d ago
Well said
eviks · 23d ago
But we're not at the end yet
Daiz · 24d ago
Lossless compression means the pixels of the output image are 100% identical to those of the input image.

This site is clearly not doing that, and should thus not be called "lossless compression". The industry term for lossy compression that appears to be basically the same as the source is "visually transparent" or just "transparent". Though this doesn't seem like that either, especially when you're compressing large PNGs into JPGs while naming the output as PNGs. That's just outright deceitful, and not a good look.
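One rough way to check a "lossless" claim in the browser is to decode both the original and the output and compare pixels - a sketch, not tied to any particular tool:

  // Decode an image Blob to raw RGBA pixels.
  async function toImageData(blob) {
    const bitmap = await createImageBitmap(blob);
    const canvas = new OffscreenCanvas(bitmap.width, bitmap.height);
    const ctx = canvas.getContext('2d');
    ctx.drawImage(bitmap, 0, 0);
    return ctx.getImageData(0, 0, bitmap.width, bitmap.height);
  }

  // Returns the number of differing pixel bytes; 0 means pixel-identical (lossless).
  async function pixelDiff(originalBlob, compressedBlob) {
    const [a, b] = await Promise.all([toImageData(originalBlob), toImageData(compressedBlob)]);
    if (a.width !== b.width || a.height !== b.height) return Infinity;
    let diff = 0;
    for (let i = 0; i < a.data.length; i++) {
      if (a.data[i] !== b.data[i]) diff++;
    }
    return diff;
  }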

iambavith · 23d ago
Judging by the responses we got, we will update the site to describe it as a "nearly lossless" approach rather than calling it "lossless". We don't mean to be deceitful.
pornel · 24d ago
In image compression, "lossless" is a term of art. What you're doing is a practically useful quality degradation, but it's not lossless.
umtksa · 24d ago
I just tested with a folder full of JPEGs, and I didn't even have to compare them to see the artifacts. Some kind of "lossless".
mootothemax · 24d ago
What am I missing about the 2.8mb example image not reducing to 698kb as it says on the homepage?

I downloaded https://www.imgfiber.com/compare/original.jpg and dragged it into the try-it-now section - says it's now 1.3mb.

That's worse than Squoosh's default 897kb (https://squoosh.app/editor), though better than CompressorJS's default 2.21mb (https://fengyuanchen.github.io/compressorjs/); (edit ii) though changing CompressorJS's quality setting to 0.75 results in the exact same output as ImgFiber.

Edit: this is on latest Chrome + macos.

Doohickey-d · 24d ago
As an alternative to this, I quite like https://squoosh.app/ - Also works on-device, but additionally it's open source, supports multiple output formats, and exposes all the speed / quality / transparency options.
iambavith · 24d ago
Squoosh is great, but for the average user, dealing with so many parameters feels overwhelming and can produce odd results. When you're handling a large number of files - like a Midjourney archive or a collection of images - it's more convenient to use an app with preset settings that can process them in bulk, rather than one with detailed options that require fine-tuning.
todotask2 · 24d ago
I would suggest avoiding this in Firefox; the images can appear brighter than when converted in other browsers.
starwatch · 24d ago
Sidenote, but I enjoyed seeing the squoosh maintainers talk through [1] how they diagnosed and resolved a memory leak they had a while back.

[1]: https://youtu.be/YDU_3WdfkxA?si=n-ZZwRI9V51p-Pxc

WoodenChair · 24d ago
Are you using a known tool or library? If not, what’s the algorithmic technique?
iambavith · 24d ago
We're using a heavily modified CompressorJS library to get these results. Hope that helps.
bflesch · 24d ago
Can you explain a bit how working with the JPG format was different from working with the PNG format? How did you achieve significant savings for both, given that many clever people have worked on this before?
iambavith · 24d ago
We focused on lossless techniques like metadata cleanup, optimal encoding settings, and structural compression—for both JPG and PNG.
CyberDildonics · 24d ago
Does this mean that some of your 'compression' is just deleting metadata?

> structural compression

What does this mean?

iambavith · 23d ago
Yes, some gains come from stripping metadata, but structural compression means optimizing how image data is encoded, like better Huffman tables for JPGs or smarter filtering for PNGs.
CyberDildonics · 23d ago
> some gains come from stripping metadata

That's not compression, that's just deleting people's data.

> structural compression means optimizing how image data is encoded, like better Huffman tables for JPGs or smarter filtering for PNGs.

I think that's just called compression.

How are you getting better Huffman tables for JPGs, and how are you doing 'smarter' filtering for PNGs?

I'm asking for deeper technical explanations because beating current image compression libraries would be a real technical feat, and a lot of people in this thread think that you aren't actually doing anything differently.

james-bcn · 24d ago
I'm a big fan of Clop. https://lowtechguys.com/clop/
iambavith · 23d ago
love the drag and drop to optimize approach. so cool.
miyuru · 24d ago
Is the tool working? I uploaded a couple images and it gave out 0 compression.

Also, I am not sure you thought the product through: when I want to compress images, I want to control the output and the quality setting. This does not seem to do anything.

iambavith · 23d ago
Yes, the tool works. It's weird that your images didn't show any reduction in size; I would love to try fixing it.

We did think it through, based on our use case :) which is bulk image processing, where a slight loss in quality in exchange for a much smaller size is appreciated :) mostly for wallpaper apps or other image-heavy sites. But yeah, we will improve it to offer an even cleaner approach.

team_groovy · 24d ago
I really like the kewltools image file size reducer

https://kewltools.com/image-resizer

Just enter the desired file size and it will compress down to it.
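Compressing to a target size is usually done by searching over the quality setting and re-encoding until the output fits. A rough sketch of that idea with CompressorJS - not how kewltools actually implements it:

  import Compressor from 'compressorjs';

  function compressAt(file, quality) {
    return new Promise((resolve, reject) => {
      new Compressor(file, { quality, success: resolve, error: reject });
    });
  }

  // Binary-search the quality setting until the result fits under targetBytes.
  async function compressToSize(file, targetBytes, steps = 7) {
    let lo = 0, hi = 1, best = null;
    for (let i = 0; i < steps; i++) {
      const q = (lo + hi) / 2;
      const out = await compressAt(file, q);
      if (out.size <= targetBytes) { best = out; lo = q; }  // fits: try higher quality
      else { hi = q; }                                      // too big: lower quality
    }
    return best;  // null if even the lowest quality tried was still too large
  }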

iambavith · 23d ago
Wow, I really love this. Thank you for mentioning it. It's super useful for uploading to government websites - they really love limiting upload sizes.