Replies: 1 comment
If you're comfortable with writing scripts (or letting LLMs write them for you), Karakeep has a way of skipping the actual crawling and relying on "pre-crawled archives" (it's what we use with SingleFile). You upload the archive to the asset DB, then when creating a bookmark you pass the asset ID as "preCrawledAssetId". Karakeep will then extract the metadata from this HTML instead of crawling the website. PDFs can be attached afterwards via the API as well. Happy to provide more details if that's something you're up for. Just point Claude or something at both services' API specs and it'll be able to figure it out itself :)
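A rough sketch of what such a script could look like, in Python. Note the endpoint paths, the instance URL, the token, and the exact spelling of the `preCrawledAssetId` field are assumptions based on this comment and should be verified against Karakeep's own API spec before use:

```python
import json
import urllib.request

KARAKEEP_URL = "https://karakeep.example.com"  # hypothetical instance URL
API_KEY = "your-api-key"                       # hypothetical API token


def build_bookmark_payload(url: str, asset_id: str) -> dict:
    """Build a bookmark-creation body that points Karakeep at a
    pre-crawled HTML archive instead of crawling the live URL."""
    return {
        "type": "link",
        "url": url,
        # Field name taken from the comment above -- confirm it in the API spec.
        "preCrawledAssetId": asset_id,
    }


def create_bookmark(url: str, asset_id: str) -> dict:
    # The /api/v1/bookmarks path is an assumption; consult Karakeep's
    # published OpenAPI spec for the real endpoint.
    req = urllib.request.Request(
        f"{KARAKEEP_URL}/api/v1/bookmarks",
        data=json.dumps(build_bookmark_payload(url, asset_id)).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The asset ID passed in would come from a prior upload of the SingleFile HTML archive to Karakeep's asset endpoint; that upload step is omitted here since its exact shape depends on the API spec.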
I've been using Linkwarden for some time and have around a thousand links archived there. I've been considering switching to Karakeep, but the lack of Postgres support held me back. However, since Linkwarden development has mostly stalled, I am thinking about switching regardless of Postgres.
My issue is that many of the links I have saved in Linkwarden are no longer available on the internet. Linkwarden basically exports only the URLs and metadata, so it cannot be imported into Karakeep out of the box if the sources are no longer available.
Is there a way I could "inject" the HTML, PDF, and JSON readable formats from Linkwarden into Karakeep?