This is probably not the right place, but I figured I'd give it a try. I've been trying to copy this website so I could have offline access when tooling around the Channel Islands. I could (and probably will) just copy/paste whatever info I need for my little trips, but the fact that I can't copy it all over is annoying the hell out of me. I've tried variations of wget -r ... as well as httrack with no success. Anyone have any idea how I can get this?
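For reference, a plain wget -r often fails on modern sites; a fuller mirror invocation looks something like this (a sketch, not guaranteed to work on this site; the URL, wait time, and user-agent string are placeholders):

```shell
# A fuller wget mirror invocation than plain `wget -r`:
#   --mirror            recursive download with timestamping
#   --convert-links     rewrite links so saved pages work offline
#   --adjust-extension  add .html extensions where needed
#   --page-requisites   also fetch the CSS, images, and scripts each page needs
#   --no-parent         don't climb above the starting directory
# The wait flags and a browser-like user-agent help with hosts that
# throttle or block obvious crawlers.
wget --mirror --convert-links --adjust-extension --page-requisites \
     --no-parent --wait=1 --random-wait \
     --user-agent="Mozilla/5.0" \
     https://example.com/
```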

  • redcalcium@c.calciumlabs.com · 1 year ago

    Another option is to use the SingleFile browser extension. You can install it from your browser’s extension marketplace.

    SingleFile can download an entire web page into a single HTML file while preserving the layout and images. It basically inlines everything into one file, which is very handy. The drawback is that you have to do it manually on every page you want to save, but it works really well. Even the Google Maps embed in the page you linked was preserved (though it's no longer interactive, because JavaScript isn't preserved).

    • sin_free_for_00_days@lemmy.one (OP) · 1 year ago

      Oh, that is nice. Thanks for the link. That works great: basically the same effort as copy/pasting, but with far better results.

  • thereddevil@sh.itjust.works · 1 year ago

    Do you want it for one trip or all?

    For one trip, you can open it and then save it as HTML from your browser.

    For all trips: if the page data is fetched from the backend by a request, I'm not sure it's possible.
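    One quick way to check which case you're in (a sketch; the URL and search term are placeholders, pick a word you know appears on the trip page):

    ```shell
    # If the trip text shows up in the raw HTML, the page is server-rendered
    # and wget/httrack can capture it. If grep finds nothing, the data is
    # loaded by JavaScript from a backend API after the page loads, and a
    # plain recursive crawl won't get it.
    curl -s 'https://example.com/some-trip' | grep -i 'anchorage'
    ```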

  • jablz@vlemmy.net · 1 year ago

    I’ve used ArchiveBox as a desktop app (on Linux Mint), which saves a snapshot of each URL you feed it. It worked for the sites I needed when I was looking for an offline solution.
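    For anyone who prefers the command line over the desktop app, ArchiveBox's basic CLI flow looks roughly like this (a sketch; the directory and URL are placeholders, check the current docs for details):

    ```shell
    # Create a collection, snapshot a URL, and browse the results locally.
    mkdir ~/archive && cd ~/archive
    archivebox init                          # set up a new archive collection here
    archivebox add 'https://example.com/'    # save a snapshot of one URL
    archivebox server 0.0.0.0:8000           # browse snapshots in a local web UI
    ```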