On Fri, 25 Jan 2008 15:38:54 -0500, Moshe Goldfarb wrote:
> On Fri, 25 Jan 2008 20:23:32 +0000, [H]omer wrote:
>
>> Verily I say unto thee, that Roy Schestowitz spake thusly:
>>
>>> WebCopier for Linux - Get a Copy of Your Favorite Websites and Browse
>>> Them Offline at Any Time.
>>
>> Why not just use lftp or wget?
>
> Tell me again how user friendly Linux is.
*Extremely* so. It is, however, _different_.
Okay, I'll pose you a challenge. I'll use nothing but Linux CLI tools,
such as wget, bash, and Apache. You can use whatever you like, as long
as it is *not* open source, runs on Windows, is GUI-based, and doesn't
require any scripting: just pointy-clicky type interfaces. Oh, and each
step is performed by a different app, tool, or script.
Step 1:
From each of userfriendly.org, dilbert.com and one other cartoon site,
retrieve today's cartoon. Save it in a directory with the others -
whether they all go in the same directory, or one directory for each of
the three sites, is up to you.
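To show what I mean for the Linux side, here's a minimal sketch of step
1 for one site. The image-URL pattern in the grep is an assumption for
illustration, not dilbert.com's actual markup; you'd check the real
page, and the other two sites would get similar small scripts:

    #!/bin/bash
    # Sketch: fetch today's Dilbert strip. The grep pattern is a
    # guess at the page markup, NOT the site's actual HTML.
    outdir="$HOME/cartoons"
    mkdir -p "$outdir"
    img=$(wget -q -O - http://dilbert.com/ |
          grep -o 'http://[^"]*dilbert[^"]*\.gif' | head -n 1)
    wget -q -O "$outdir/dilbert-$(date +%d-%m-%y).gif" "$img"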
Step 2:
Take the list of fetched cartoons and sort it by name, then by date:
dilbert-03-01-08
dilbert-02-01-08
dilbert-01-01-08
userfriendly-03-01-08
userfriendly-02-01-08
userfriendly-01-01-08
The actual naming convention doesn't matter, as long as the list is
presented sorted by name and then by date, with the date descending.
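On the Linux side, assuming the name-DD-MM-YY convention shown above,
one sort(1) invocation covers it: split on '-', name ascending, then
year, month, day descending:

    ls "$HOME/cartoons" | sort -t- -k1,1 -k4,4r -k3,3r -k2,2r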
Use the sorted list to produce a simple web page, with links to each of
the images. Save the page into a directory accessible by your web
server, such that anyone on your LAN can browse those images from your
machine.
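A sketch of that step in plain shell - the /var/www path is an
assumption (wherever your Apache document root lives), and it assumes
the cartoon directory itself is served, e.g. via a symlink:

    #!/bin/bash
    # Sketch: build a bare-bones index page linking each cartoon,
    # in the sorted order described above. /var/www is an assumed
    # Apache document root; adjust for your distribution.
    {
      echo '<html><body>'
      ls "$HOME/cartoons" | sort -t- -k1,1 -k4,4r -k3,3r -k2,2r |
      while read f; do
        echo "<p><a href=\"cartoons/$f\">$f</a></p>"
      done
      echo '</body></html>'
    } > /var/www/cartoons/index.html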
That's it, just a simple "get the strip and make it available to the LAN"
process. Nothing to it.
Remember, though, *I* don't get to use *any* GUI tools and the like, and
you don't get to use *any* CLI, OSS or scripting. Just GUI.
Three final conditions and we're ready to go:
1) Neither of us can use prefab "cartoon fetch" code or apps; this must
be strictly a case of solving each part of the job on its own, with its
own tools.
2) The end result must run without human intervention, via a timed
launch (e.g. cron; see the crontab sketch after this list), so it
fetches each day's cartoon.
3) It must be easy for a user to add another cartoon site to be fetched.
That said, limits on format are fine: it is acceptable, for example, to
restrict the code to fetching only images named "dilbert*" from the
first page loaded.
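For condition 2, the timed launch on the Linux side is one crontab
line (the script path here is hypothetical):

    # Run the whole fetch-and-publish script at 06:00 every day.
    0 6 * * * /home/user/bin/fetch-cartoons.sh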
Once we each have a working solution, we'll hand the details over to
third parties, to see exactly how much effort is involved in duplicating
the setup on their machines - everything from downloading, installing and
configuring the code to setting up the automated execution.
Why this challenge? Well, it's small and easily achieved, at least on
Linux with tools such as wget. You decry that as not being user
friendly, so this will let you show *exactly* how user friendly Windows
is at doing the *exact same job* as Linux.
So, you up for it?
Yet you want to call your way "user friendly". With a few minutes and
some basic skills, I can brighten several people's day - today,
tomorrow, for a year. How long does it take *you* to do the same job
with your *easy* tools, hmm?
Prediction: you won't go for it, because you *know* your "user friendly"
stuff is anything but.