
Show HN: Remove-bg – open-source remove background using WebGPU

Birch-san
25 replies
3d20h

feels like it could be nice to abide by the license terms https://bria.ai/bria-huggingface-model-license-agreement/

1.1 License.
> BRIA grants Customer a time-limited, non-exclusive, non-sublicensable, personal and non-transferable right and license to install, deploy and use the Foundation Model for the sole purpose of evaluating and examining the Foundation Model.
> The functionality of the Foundation Model is limited. Accordingly, Customer are not permitted to utilize the Foundation Model for purposes other than the testing and evaluation thereof.

1.2. Restrictions. Customer may not:
> 1.2.2. sell, rent, lease, sublicense, distribute or lend the Foundation Model to others, in whole or in part, or host the Foundation Model for access or use by others.

> The Foundation Model made available through Hugging Face is intended for internal evaluation purposes and/or demonstration to potential customers only.

Laaas
12 replies
3d19h

The repo doesn’t include the model.

NewJazz
11 replies
3d19h

Does the site not distribute it?

benatkin
10 replies
3d19h

It doesn't, except that it runs it. There's no download link or code playground for running arbitrary code on it, so while it technically transfers the model to the computer where it's running (I think), it's not usually considered the same as distributing it.

NewJazz
6 replies
3d18h

Yeah, that doesn't sound right to me.

benatkin
5 replies
3d17h

What's the point of running it in WebGPU then?

I think it's either running the model in the browser or a small part of it there. Maybe it's downloading parts of the model on the fly. But I kinda doubt it's all running on the server except for some simple RPC calls to the browser's WebGL.

jfoster
3 replies
3d16h

Anyone can easily do an online/offline binary check for web apps like these:

1. Load the page

2. Disconnect from the internet

3. Try to use the app without reconnecting

benatkin
2 replies
3d16h

Well, my question is about where it lies within the gray area between fully online and fully offline, so that wouldn't work.

Edit: Good call! It's fully offline - I disabled the network in Chrome and it worked. Says it's 176MB. I think it must be downloading part of the model, all at once, but that's just a guess.

The 176MB is in storage, which makes me think my browser will hold onto it for a while. That's quite a lot. My browser really should provide a disk clearing tool that's more like OmniDiskSweeper than Clear History. If, for instance, it showed just the sites storing over 20MB and my profile was using 1GB, there would be at most 50 entries, a manageable amount to go through and clear the ones I don't need.
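
For what it's worth, you can at least see how much an origin is storing from the console; a minimal sketch using the standard Storage API (nothing specific to this app):

```ts
// Ask the browser roughly how much this origin is storing (Cache Storage,
// IndexedDB, etc.) versus how much it is allowed to use.
const { usage = 0, quota = 0 } = await navigator.storage.estimate();
console.log(`${(usage / 1e6).toFixed(1)} MB used of ~${(quota / 1e9).toFixed(1)} GB quota`);
```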

jfoster
1 replies
3d13h

Yeah, this is why I think browsers need to start bundling some foundational models for websites to use. It doesn't scale if many websites each start trying to store a significantly sized model.

Google has started addressing this. I hope it becomes part of web standards soon.

https://developer.chrome.com/docs/ai/built-in

"Since these models aren't shared across websites, each site has to download them on page load. This is an impractical solution for developers and users"

The browser bundles might become quite large, but at least websites won't be.

nkrisc
0 replies
3d9h

As long as there’s a way to disable it. I don’t want my disk space wasted by a browser with AI stuff I won’t use.

NewJazz
0 replies
3d17h

What's the point of running it in WebGPU then?

Use client resources instead of server resources.

papa_bear
2 replies
3d17h

Pretty sure downloading it to your browser counts as distributing it, legally speaking.

benatkin
1 replies
3d17h

AYAL?

awwaiid
0 replies
3d5h

Sure!

jfoster
3 replies
3d19h

A lot of these AI licenses are a lot more restrictive than old school open source licenses were.

My company runs a bunch of similar web-based services and plans to do a background remover at some stage, but as far as I know there are no current models with a sufficiently permissive license that can also feasibly be downloaded & run in browsers.

sangnoir
2 replies
3d17h

Meta's second Segment Anything Model (SAM2) has an Apache license. It only does segmenting, and needs additional elbow grease to distill it for browsers, so it's not turnkey, but it's freely licensed.

jfoster
1 replies
3d16h

Yeah, that one seems to be the closest so far. Not sure if it would be easier to create a background removal model from scratch (since that's a simpler operation than segmentation) or to distill it.

jaxn
0 replies
3d15h

I got pretty far down that path during Covid for a feature of my SaaS, but limited to specific product categories on solid-ish backgrounds. Like with a lot of things, it's easy to get good and takes forever to get great.

EMIRELADERO
2 replies
3d18h

AI model weights are probably not even copyrightable.

thenickdude
1 replies
3d17h

Surely they would at least be protected by Database Rights in the EU (not the US):

The TRIPS Agreement requires that copyright protection extends to databases and other compilations if they constitute intellectual creation by virtue of the selection or arrangement of their contents, even if some or all of the contents do not themselves constitute materials protected by copyright

https://en.wikipedia.org/wiki/Database_right

EMIRELADERO
0 replies
3d17h

Those require the "database" in question to be readable and for every single element to be so too. Model weights don't satisfy that requirement.

littlestymaar
1 replies
3d11h

Keep in mind that whether or not a model can be copyrighted at all is still an open question.

Everyone publishing AI models is actually acting as if they owned copyright over them, and as such shares them under a license, but there's no legal basis for such a claim at this point; it's all about pretending and hoping the law will be changed later on to make the claim valid.

NewJazz
0 replies
3d3h

Train on copyrighted material

Claim fair use

Release model

Claim copyright

Infinite copyright!

xdennis
0 replies
3d7h

It's kind of silly to complain about not abiding by the model license when these models are trained on content not explicitly licensed for AI training.

You might say that the models were legally trained since no law mandates consent for AI training. But no law says that models are copyrightable either.

giancarlostoro
0 replies
3d4h

At some point the world's going to need a Richard Stallman of AI who builds up a foundation that is usable and not under the total control of major corporations, with reasonable licensing. OpenAI was supposed to fit that mold.

akpa1
11 replies
4d

Very cool idea, but for me it completely doesn't work. I'll open an image, the entire computer (running Linux) will briefly freeze (including stopping Spotify in its tracks), and when it comes back, the only thing that's left is a message from Firefox telling me the tab has crashed.

lxgr
5 replies
3d21h

A website being able to do this is arguably a browser bug.

If a browser's sandbox can't even protect against accidental resource exhaustion, I'd be very concerned about that as an intentional attack vector.

ipaddr
3 replies
3d20h

What browser protects against resource exhaustion? Chrome and all variants do not. Running out of memory or CPU or even hard drive space can happen, and does, in all browsers.

lxgr
2 replies
3d19h

Every browser I know meters all APIs capable of using disk space, and 100% CPU usage doesn’t hang your system on any reasonable OS.

Memory can indeed be a problem, but at least if a tab becomes the largest single memory user on my system, the OOM killer will come for it first.

So if browsers can lean on the OS for proper CPU and memory resource management but can't for the GPU, maybe their WebGPU implementations aren't ready for production yet.

codedokode
1 replies
3d5h

Before the OOM killer activates, the system will swap out all other applications, including the desktop shell. If you are unlucky enough to use a spinning hard drive, your system will freeze for a long time.

This is a problem with popular Linux distributions: they have no protection against an application swapping out important system processes.

Linux has disk quotas and a CPU scheduler, but it doesn't have fair memory and swap management (or it is not configured out of the box). For example, the desktop shell has no protection against being swapped out.

lxgr
0 replies
2d21h

Ah yes, that is indeed very annoying. That's why I usually don't configure a swap partition on Linux, although that probably ends up wasting a lot of memory on write-once-read-never pages that could otherwise be used for the page cache.

A per-process real memory maximum that automatically invokes the OOM killer unless somehow opted out would probably be useful.

codedokode
0 replies
3d5h

Browsers do not prevent a tab from using all available memory and swapping out other applications, including the desktop shell.

jml7c5
1 replies
3d20h

Sounds a bit like running out of memory, followed by the OOM killer going after the process... though I'm not sure the OOM killer would act that quickly. Do you get any interesting messages in `dmesg`?

akpa1
0 replies
3d20h

You're spot on - dmesg shows it's OOMing, and pretty fast too. Turns out it happens even if I just open the web page, let alone start trying to remove the background of an image, so I'd guess it's something to do with loading whatever model is being used.

anduc
1 replies
4d

I tried it on Chrome on my Mac, but I'm not sure how it works on other devices (assuming it works at all). I'll test it out later. Sorry for the inconvenience.

akpa1
0 replies
4d

Please don't apologise for any inconvenience! It's a cool project and I'd love to see it working, I'm just sorry I can't give any positive feedback

cmgriffing
0 replies
3d20h

Looks like WebGPU is only in Nightly for Firefox, for now.

beeboobaa3
7 replies
4d

Status: Error

Firefox on Linux: Error: Unsupported device: "webgpu". Should be one of: wasm.

Chromium on Linux: Error: no available backend found. ERR: [webgpu] Error: Failed to get GPU adapter. You may need to enable flag "--enable-unsafe-webgpu" if you are using Chrome.

Passing the --enable-unsafe-webgpu flag results in the same error.
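
For context, that "Should be one of: wasm" message is the kind of thing a feature-detect-and-fall-back flow produces when no GPU adapter is available. A minimal detection sketch, assuming only the standard WebGPU API (not this project's actual code):

```ts
// Probe for a usable WebGPU adapter and fall back to the wasm backend otherwise.
// (With TypeScript, `navigator.gpu` typechecks only with the @webgpu/types package,
// hence the `any` cast in this sketch.)
async function pickBackend(): Promise<"webgpu" | "wasm"> {
  const gpu = (navigator as any).gpu;          // not exposed at all on e.g. Firefox stable
  if (!gpu) return "wasm";
  const adapter = await gpu.requestAdapter();  // exposed but may fail without Vulkan/flags
  return adapter ? "webgpu" : "wasm";
}
```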

beeboobaa3
0 replies
4d

This worked, thanks!

arendtio
0 replies
3d21h

This!

anduc
1 replies
4d

Maybe it's because of how I detect the GPU and switch to another backend (to support devices that don't support WebGPU).

Can you try going to https://pmndrs.github.io/detect-gpu/ and pasting the result here?

beeboobaa3
0 replies
4d

{ "fps": 60, "gpu": "amd radeon r9 200", "isMobile": false, "tier": 3, "type": "BENCHMARK" }

It's an AMD Radeon RX 6600.

It worked on Chromium after passing --enable-unsafe-webgpu --enable-features=Vulkan

slimsag
0 replies
3d16h

Firefox doesn't support WebGPU, FYI.

cwillu
0 replies
4d

“chrome://gpu/” may give more clues as to what went wrong.

sva_
5 replies
3d22h

It would be cool if it could ask before loading the model, or at least indicate to me how large the download will be, as I'm on a metered connection right now.

But maybe that's just a me-problem.
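
For what it's worth, a page could ask before fetching a large model; a rough sketch using the Chromium-only Network Information API, with the size figure and prompt purely illustrative:

```ts
// Rough pre-download check: on data-saver or very slow connections, ask before
// pulling a large model. navigator.connection is Chromium-only, hence the cast.
const conn = (navigator as any).connection;
const modelSizeMB = 176; // illustrative figure, taken from reports elsewhere in this thread
if (conn?.saveData || conn?.effectiveType === "2g") {
  if (!confirm(`This will download ~${modelSizeMB} MB of model weights. Continue?`)) {
    throw new Error("Download declined");
  }
}
```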

zamadatix
3 replies
3d20h

Looks like ~4 MB, I think that's a fair size to not throw up warnings about (unless I'm missing something in the Network view of dev tools w/o cache). That said I wonder what people consider the "Click to enlarge (may take a while to load)" courtesy size to be in 2024.

lynguist
1 replies
3d20h

I would probably consider 50 MB that size, or in the special case of metered connections 20 MB (for example, downloading maps or the like).

jfoster
0 replies
3d16h

50 MB might be fine for desktops on effectively unlimited & high speed connections, but consider the case of a mobile user with a few GB of data per month. Might be unacceptable for them. Not sure how common that case is in the US, but certainly possible outside the US.

sva_
0 replies
3d7h

Looks like ~4 MB

You got me!

The model was 176 MB. Total pageload transferred 182 MB.

https://imgur.com/a/6xx3Lgu

It doesn't seem like "Disable cache" in the DevTools empties the Cache Storage.
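
Indeed, "Disable cache" doesn't empty Cache Storage; that has to be cleared separately, for example from the console with the standard CacheStorage API:

```ts
// List and delete every Cache Storage entry for the current origin (run in the console).
for (const name of await caches.keys()) {
  console.log("deleting cache:", name);
  await caches.delete(name);
}
```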

Loughla
0 replies
3d18h

After living with satellite Internet as the only option for about 15 years, now that I have fiber, I still catch myself declining downloads that are too big and opening the scheduler.

Old habits die hard.

And the modern Internet implicitly assumes the end user is not on a metered connection. Websites are fucking massive these days.

simjnd
5 replies
4d

Very cool, but man pulling 900+ dependencies to build and run this thing feels awful. The NPM culture is out of control.

ashishb
2 replies
3d22h

The JS standard library is very limited, so pulling in third-party dependencies even to left-pad a string is normal.

throwaway63820
1 replies
3d22h

You mean the native `str.padStart(targetLength, padString)`?
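
For reference, it behaves like left-pad did:

```ts
"5".padStart(3, "0");   // "005"
"abc".padStart(6);      // "   abc" (pads with spaces by default)
```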

zamadatix
0 replies
3d20h

To be fair (as one who leans pretty heavily in favor of the JS world), .padStart() was only added in response to the aforementioned left-pad fiasco. The language adding it was more a face-saving measure after the blowback than an attempt to fix the actual problems.

Despite all that, left-pad still gets > 1 million weekly downloads on npm.

anduc
1 replies
4d

Haha, I didn't notice this! Yeah, using only Vite would reduce the dependencies a lot, but I'm not sure which one is the biggest offender here.

Will try to make the deps simpler later; after all, this was only a little bit of work.

thelastparadise
0 replies
3d23h

Will try to make the deps simpler later; after all, this was only a little bit of work.

A word of advice: you may not want to jump on every last suggestion from the peanut gallery instantly.

anduc
1 replies
4d

It's nice to see it here. Imagine something that works like your repo but runs entirely in the user's browser: very cool, and it doesn't need any complicated setup.

nadermx
0 replies
3d19h

Yeah, that is awesome

afuqya
1 replies
3d13h

@nadermx Does it work in real time on a webcam? I ask because you have an animated GIF in your link.

nadermx
0 replies
2d16h

I haven't messed much with it, but it would probably be possible to have ffmpeg pull the stream and run it in 30 fps blocks. I can do a pull request if you want.

amelius
4 replies
3d20h

Why can't I just "apt-get install" tools like this?

I used to have all the powerful tools at my fingertips but now I feel like my Linux distro is slowly disappearing into a vortex of irrelevance.

solardev
0 replies
3d17h

For every Linux power user, there are probably a million regular web users who don't know (or care to know) what a command line is. Web tools are for them.

Why not fork the repo and repackage it as a CLI app if you really want it that way?

kinduff
0 replies
3d19h

I wouldn't blame apt-get; there are a lot of command-line tools nowadays. I don't usually install these if it's not going to be a frequent task.

Anyways, if you're looking for a cmd tool, rembg [0] is a pretty good one.

[0]: https://github.com/danielgatis/rembg

fulafel
0 replies
3d4h

People are writing software for GPUs more commonly now, and because of the troubled GPU programming landscape it's hard to make programs that work on more than one vendor's proprietary flavour of GPU. WebGPU is an attempt to address some of that in browser apps, but it's not ready yet.

akdev1l
0 replies
3d19h

Because someone just created it, so no one has packaged it yet.

fragmede
3 replies
4d1h

I'm impressed that it works. In the olden days this was a painstaking job in Photoshop, and I haven't been keeping up with SOTA, but I'm also impressed at how little code there actually is in src/ai.ts to make it happen.

Good job!

wongarsu
0 replies
4d

It's pretty much one click in Photoshop now too, with better results and more control than what this tool is offering.

Not that I'm blaming this tool for being worse than a $200+/yr product. If anything it's impressive how close it gets with so little code. And if you just want rough results on a large number of files, it even looks superior.

seanthemon
0 replies
4d

A lot of tools exist for this; even on my MacBook I can just right-click, choose Quick Actions, and remove the background.

anduc
0 replies
4d

Same here. This is my first time working with this library, and it's really making me believe even more in the future of Transformers.js/WebGPU; it's just the beginning.
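
For anyone curious, the general Transformers.js pattern with a WebGPU device hint looks roughly like the sketch below; the task name, model id, and options are assumptions for illustration, not this project's actual src/ai.ts.

```ts
import { pipeline } from "@huggingface/transformers";

// Load a segmentation-style model and ask for the WebGPU backend.
// "briaai/RMBG-1.4" is the model discussed elsewhere in this thread; treat it
// (and the task name) as an illustrative guess, not the app's exact configuration.
const segment = await pipeline("image-segmentation", "briaai/RMBG-1.4", {
  device: "webgpu",
});

// Pipelines accept an image URL (or a blob URL from a file input).
const [{ mask }] = await segment("photo.jpg");
// `mask` can then be applied as the alpha channel of the original image.
```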

andrewstuart
3 replies
3d22h

"Remove Background" is at Tools/Remove Background in MacOS Preview.

nextaccountic
1 replies
3d21h

This one is open source.

stephenr
0 replies
3d9h

What are you basing that assertion on? There’s no licence information provided in the repo or on the site, that I can see.

stephenr
0 replies
3d21h

Or just press and hold (iOS/iPadOS) / right click (macOS) on the subject and choose "copy subject".

tech-no-logical
2 replies
3d23h

Firefox / Linux. I only get fully transparent output: https://i.imgur.com/kcu2LSR.png

Nothing interesting in the console.

slimsag
0 replies
3d16h

Firefox doesn't support WebGPU today.

nabla9
0 replies
3d22h

Same here. Plus error with Chromium.

Getting it working on just one setup is only the midpoint of web development.

maven29
1 replies
4d1h

How does this compare to "segment anything" from Meta?

m00x
0 replies
3d20h

Much smaller and better at background removal, but doesn't segment everything.

ramonverse
2 replies
4d1h

Opened, uploaded image, looked good (it worked!). Then my browser (Arc) started freezing, unfroze after closing your website :/

anduc
1 replies
3d11h

Sorry for the inconvenience. I've tested with Arc and can confirm this error, but I don't know why it happens. I will check further to determine the reason and report it to the upstream library (I think it's either the engine or Arc itself).

vednig
0 replies
2d7h

Arc is built on Chromium but doesn't have Chrome's built-in popups and error dialogs active.

Most probably it's because of cache utilization and network overload; it can likely be solved by clearing the cache and creating a service worker to manage the model download and invalidate memory.
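
A bare-bones sketch of that service-worker idea, caching the model file once and serving it from Cache Storage on later loads (the cache name and ".onnx" URL pattern are made up for illustration):

```ts
// sw.ts (sketch): cache-first handling for the large model file only.
const MODEL_CACHE = "model-cache-v1"; // hypothetical cache name

(self as any).addEventListener("fetch", (event: any) => {
  if (!new URL(event.request.url).pathname.endsWith(".onnx")) return;

  event.respondWith(
    caches.open(MODEL_CACHE).then(async (cache) => {
      const hit = await cache.match(event.request);
      if (hit) return hit;                          // serve the cached model on later loads
      const response = await fetch(event.request);
      cache.put(event.request, response.clone());   // store it the first time
      return response;
    })
  );
});
```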

ivolimmen
2 replies
3d12h

Well, I gave it 2 tries before giving up. It does not do anything in Firefox, and in Chrome it also failed after loading some data; it told me to restart Chrome with some unsafe GPU flag.

anduc
0 replies
3d12h

Someone mentioned:
> Looks like WebGPU is only in Nightly for Firefox, for now

anduc
0 replies
3d12h

Hey there, sorry for the issue. Does it show status "Error"? If so, can you try to enable WebGPU using these flags: "--enable-unsafe-webgpu --enable-features=Vulkan"? If not, it doesn't seem like a common error and I will spend more time on testing later.

Feel free to raise an issue on GitHub to keep track of progress (it helps open source a lot), or you can DM me on x.com/duc__an anytime.

ericol
2 replies
4d1h

The very first image I uploaded (a model lighthouse with a very obvious background) gives just "Error".

anduc
0 replies
4d

The error is ambiguous right now and I'll try to make it clearer (contributions welcome). The idea is that it falls back to not using WebGPU if your browser doesn't support it, but it was made in 2 hours, so bugs are to be expected :)

anduc
0 replies
4d

Exactly this. As mentioned in the post, I've used the same technology as this playground (and copied lots of code from it); what I do is mostly make the UX better.

PS: WebGPU is the future

mvdtnz
1 replies
3d19h

Not sure if you realise you've given it the same name as another product that does this exact thing. (Disclosure: I work for Canva, the company which owns remove.bg.)

https://www.remove.bg/

anduc
0 replies
3d12h

Ah, I didn't realize there was any product with a similar name, and even had I known, I think I'd still pick this name since it's only an experimental open source WebGPU project. Remove-bg is quite a common term IMO, but it will definitely affect SEO if I want to do more with the project; I have no plans yet though.

lelandfe
1 replies
3d22h

Thanks for sharing; it looks like this has trouble with certain kinds of images. Here's likely the most representative example:

Original: https://imgur.com/a/NrEXfua

BG removed: https://imgur.com/a/JWKHVGE

Much of the background was untouched, and almost all of the actual data (the axes and bars) was removed instead.

anduc
0 replies
3d12h

Sorry for my late reply, I missed the comment.

Yeah, I think it's because of the quality of the model; hopefully we will have better quality in the near future. I will see if there's anything I can do with the settings.

kamjin
1 replies
3d16h

I tried your site (https://bannerify.co/tools/remove-bg) with an image. The page crashed after 10 or so seconds and Chrome prompted me with Crashpad_NotConnectedToHandler. It doesn't work!

anduc
0 replies
3d12h

Can you raise this issue on GitHub? I will take a look at it later since it's not a common error; I think it's because of some settings in the browser.

I've heard someone report that Arc had a similar error.

kamjin
1 replies
3d16h

Can you take a look at this open source background removal project?

https://www.reddit.com/r/StableDiffusion/comments/1dwkwrx/i_...

It's based on Python and neural network modeling, and I was wondering if it's possible to run it via WebGPU? Or run it via WASM? No offense, just asking because it looks awesome.

anduc
0 replies
3d12h

This is the first time I've seen it. My first impression is that its output quality is better than mine, and my code is only based on one model.

In my understanding, it'll be possible if the model's author builds it for ONNX Runtime (https://onnxruntime.ai). The downside may be that users will need to download a ton of data to their device; currently it's ~100-200 MB.
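
For reference, the browser side of that usually means onnxruntime-web; a rough sketch, with the model path and provider list as assumptions rather than anything from this repo:

```ts
import * as ort from "onnxruntime-web";
// Depending on the onnxruntime-web version, the WebGPU build may need to be
// imported from "onnxruntime-web/webgpu" instead.

// Try the WebGPU execution provider first, then fall back to wasm.
// "model.onnx" is a placeholder path; the real weights are the ~100-200 MB download mentioned above.
const session = await ort.InferenceSession.create("model.onnx", {
  executionProviders: ["webgpu", "wasm"],
});

// `feeds` would map the model's input names to ort.Tensor values (preprocessed pixels):
// const results = await session.run(feeds);
```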

hokkos
1 replies
3d23h

Why do you use a canary build of an old React version?

anduc
0 replies
3d12h

Hi there, it's because I copied the project template from my previous project. I will check and try to upgrade to React 19 soon if it works with the libraries.

cutiepatootiee
1 replies
3d23h

Looking forward to testing this out and sharing it with my network if it works well.

anduc
0 replies
3d12h

Thank you, looking forward to it. If you have any feedback, please let me know via x.com/duc__an.

cutiepatootiee
1 replies
3d16h

It's not working for me

anduc
0 replies
3d11h

Can you provide some additional information for me? If you're using Linux and Chrome, you probably need to pass the GPU flags `--enable-unsafe-webgpu --enable-features=Vulkan`.

codedokode
1 replies
3d5h

You might want to warn about memory usage. I had Chromium consume several GBs of RAM and had to terminate it.

But it is great that there are now more offline tools. It would be great if there were a browser API that allowed the page to voluntarily go offline to guarantee that no data will be leaked.

anduc
0 replies
3d4h

Several GBs of RAM is definitely too much; I wasn't aware that this consumes that much memory. I will spend more time trying to figure out the minimum memory necessary for this and will add a warning on the page.

zurtri
0 replies
3d19h

I tried it on an image of me and a horse in front of a paddock and it worked perfectly. Really impressed.

Other background removal tools would find me - but erase half the horse or erase his tack or his ears. This tool worked perfectly.

Very well done!

san_dimitri
0 replies
3d2h

Thanks for sharing this repo. While I don't have time to actively contribute to the code, I have been testing on images to share my feedback for future devs.

1. Background removal is working well on a lot of different types of images: images with a background, plain or white backgrounds, men, women, children, hair, and pets.

2. After background removal, the new image is warped in some areas. For example, I have a picture of a child eating ice-cream. The background was removed perfectly but left a lot of artifacts on the child. I can share those images for testing.

Please let me know if there are other areas I can test.

makifoxgirl
0 replies
3d4h

I've tried many different background removal algorithms and found that InSPyReNet has been the most successful one. I use `transparent-background` on pip a whole bunch and it rarely fails on me compared to remove-bg and whatnot

https://github.com/plemeri/transparent-background

jacobn
0 replies
3d19h

Nice work! Making AI accessible directly in the browser (even if you have to download quite the payload for the model parameters) is a real accessibility game changer.

Shameless plug: if you prefer API-based background removal we offer a super low cost, high quality option on https://pixian.ai

hauschildt
0 replies
3d4h

Hi,

nice work. I recently also published a WebGPU version of our browser-only background removal library. It's using the onnx-runtime under the hood. Weights are from isnet. It could also run birefnet – if there is some interest – however BirefNet weights are almost 1GB in size, which is a bit much to download I guess.

There is a blog post about it and also a CPU only version available.

https://img.ly/blog/browser-background-removal-using-onnx-ru...

Source is available at: https://github.com/imgly/background-removal-js

and on npm: @imgly/background-removal

Feel free to check it out!
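
For readers comparing options, its README describes a one-call API; a sketch assuming the documented removeBackground export (older versions exposed it as a default export instead):

```ts
import { removeBackground } from "@imgly/background-removal"; // export name per the package README

// Accepts an image URL or Blob; resolves to a PNG Blob with an alpha channel.
const blob = await removeBackground("photo.jpg");
const objectUrl = URL.createObjectURL(blob); // e.g. set as an <img> src
```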

cultureulterior
0 replies
3d23h

Nice, works perfectly on win/chrome

codethief
0 replies
4d

Ahh, for a second I thought https://remove.bg had open-sourced their product (which I've been quite happy with on the few occasions I've used it).

Very cool, though!

anduc
0 replies
3d4h

Hi guys, thanks for your interest in my little project, I'm glad you all like it.

I just spent my free time making it much better based on your feedback:

- Mobile support, tested on my iPhone

- GitHub README added

- Added medium zoom for easier viewing on Mobile

- Error banner made friendlier and easier to troubleshoot

- Troubleshooting section in README

If you have any ideas to make the UX better, please let me know; I'd appreciate it.

Flop7331
0 replies
4d

Nondescript error after 180 seconds. Vivaldi on Android.

Please turn off the rolling animation for the duration timer. It looks really wrong when the numbers wind back (which they wouldn't do on a rotor) and when the trailing zeroes vanish.

Akash_deshmukh
0 replies
3d10h

Thanks! Found it useful.