Copying backup files


Philip Herlihy

One of my customers (one of the rare few I can persuade to run backups
at all) wants to keep an additional copy on a further disk, just to be
on the safe side.

Now, you wouldn't want to re-configure the destination and run the
Backup program again, as that would break the sequence of Baseline and
Incrementals in the main backup location. So it's better to copy the
files.

However, those files are created with permissions which prevent the user
from accessing them (for understandable reasons) without changing
ownership and access rights - not something many people are comfortable
doing.

I've been trying to come up with something he can run without having to
think about it too hard.

One possibility is to run a baseline from time to time directed to the
additional location, immediately followed by a new baseline to the main
backup store - but that doesn't allow him to duplicate all the
subsequent incrementals.

Another is to use something like Acronis True Image (which somehow
manages to acquire all the permissions it needs) to clone the main
backup store to the additional location - possibly using True Image's
own Incremental facility. Hmmm...

I'm inclined to write him a command-line script which will first
traverse the main backup store using takeown and icacls to make sure the
user has sufficient permissions, and then use something like Robocopy to
duplicate the files - Robocopy usefully skips files already copied. One
problem with such a script is that if the additional store is on USB it
may end up with a different drive letter (we don't have the Pro version,
so we can't use a network location). Maybe if the script resides on the
external disk I could get it to be drive-letter independent.
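
To give an idea, the permissions pass would be something like this (the
backup store path here is just an example):

  rem Example path only - substitute the real backup store location.
  rem Take ownership of the whole backup folder, then grant the current
  rem user full control so an ordinary copy can read everything.
  takeown /F "D:\WindowsImageBackup" /R /D Y
  icacls "D:\WindowsImageBackup" /grant %USERNAME%:(OI)(CI)F /T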

Or I guess we could enable the built-in Administrator account and have
him use that to run the copy. Actually - experimenting a bit - an
elevated command window does seem to be able to access these folders and
files, so maybe a plain Robocopy script (run with elevation) is the way
to go?

Any thoughts? Any hazards?
 

Gene E. Bloch

Philip Herlihy said:
....

I have two backup drives respectively marked A and B with tape labels
and in Macrium I created two backup scripts respectively labeled A and
B.

Both have been configured to look at alternative drive letters, in case
they have changed.

Unfortunately, the way all that is done is IMO not easy for a naive user
to set up. It's not that easy for me, and I claim to be experienced.

Also - I just tested - Macrium does allow me to access the files for
copying, if you prefer that method. The account is of class
Administrator, but it's not *the* admin account.
 

Philip Herlihy

Gene E. Bloch said:
....

Thanks, Gene. I think Macrium is in most respects equivalent to True
Image, in that it's copying sectors rather than files (it's a cloning
tool) and it also seems to elevate itself (or get you to do it).

I've been experimenting, and I have a basic script which (if run
elevated) can access the files without problem and copy them to the
device on which the script resides, whatever its current drive letter.
You can use parameter substitution to tease the drive letter out of the
invocation string. So far, that looks like the best option I can think
of. If run without elevation, you simply get "Access Denied" errors.
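
For anyone following along, the parameter substitution is just the ~d
modifier: %0 is the script's own invocation path, and %~d0 is its drive
letter. A trivial illustration:

  @echo off
  rem If this batch file is run as E:\copybackups.cmd, %~d0 expands to E:
  echo This script is running from drive %~d0
  rem ...so a destination on the same disk can be written as "%~d0\BackupCopy"
  rem (example name) without caring which letter Windows has assigned this time.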
 

Gene E. Bloch

Philip Herlihy said:
....

You say of Macrium, "it's a cloning tool". But it's two tools, a cloning
tool and an imaging tool. The imaging tool allows for incremental
backups, and allows for separate access to any dated image. They are
compressed images a la vhd files (I hope I recalled the extension
correctly), i.e., analogous to virtual hard drives.
 

Philip Herlihy

Gene E. Bloch said:
....

Sure. I'm not entirely comfortable using sector-backup tools for
backing up ordinary files (though I'm not entirely sure why not). I'd
imagine if you did a defragmentation, the incremental backup next time
would be huge. It just seems so wasteful. I have to say, I liked
ntbackup, which goes back to Windows 2000!
 

Gene E. Bloch

Philip Herlihy said:
....

My experience doesn't seem to agree with your speculation.

It might be informative to do an experiment: do an image backup, defrag,
and do an incremental right away.

Since Windows automatically defrags my computer when I'm not looking (I
never killed that default behavior), I should see some lengthy
incremental backups in Macrium, but I don't.
 

Ashton Crusher

Philip Herlihy said:
....

Maybe my memory is bad but I don't recall having any problem simply
copying my TrueImage backup files to a new location. AFAIK they are
just files like any other file.
 

Gene E. Bloch

....

Well, what the heck. I decided to read the manual. In the program's
Help, which sends you to the Macrium website,

http://www.macrium.com/help/v5/reflect_v5.htm

Under Advanced Topics, I found this:

"When backing up a disk or partition you can select to do so using
Intelligent Sector Copy. This is the recommended method for creating
images. Using this method will result in smaller images and create them
with greater speed.

How does it work ? In the case of a full image, Macrium Reflect will
take a snapshot and then save only the clusters that are in use on the
disk. In the case of a differential or an incremental, after the
snapshot has been taken, Macrium Reflect will compare the clusters in
use with the previous image and then save only those clusters that have
been modified."

So I'm forced to agree with you, much as I hate to admit it :)

I had somehow believed that was the way cloning, and only cloning, works
in Macrium Reflect, but I haven't found documentary evidence that they
do cloning the way full images are described above. Of course, I think
they should do that.

When imaging, I still haven't seen huge incremental files or seen huge
backup times unless I have gone way too long since the last image
backup, which happens much too often.
 

Robin Bignall

Gene E. Bloch said:
....

Neither do I using ShadowProtect. Although the SP people do warn
against doing an incremental during a defrag run, I've never seen any
problems.
 

Gordon

Philip Herlihy said:
....

Will someone please correct me if I'm making a serious error in this
matter. I use three external 500 GB hard drives in separate DYNEX
cases. I back up or rather copy the entire Libraries folder from each
computer once a week. I just create a new folder on one of the
external hard drives, using the date as part of the folder name
(130122 Pavilion) for example. Then do a copy/paste from the
computer's Libraries into the hard drive's new folder. This gives me a
new copy of the Libraries folder each week or so and I have several
earlier copies remaining on these hard drives, just in case I run into
a problem. I can delete the older copies when the external hard drives
begin to get too full.

Is this not a workable, easy way to assure myself that I have safe,
secure backups? I put these backup external drives in separate
buildings, as a safeguard against some disaster like a fire or
tornado.
 

Gene E. Bloch

Gordon said:
....

If you're sure that all the data you care about is in the libraries,
that should be OK.

But I find libraries a bit strange. They don't always do what I expect
of them, so I would suggest spending some quality time looking at one of
your backup folders to make sure it really does contain what you think
should be there.

All in all, clones and image backups work for me because they contain
everything on my hard drive, so if I need something that I didn't
anticipate that I might need, it will be there, assuming I can find it
:) ... I wouldn't rely on your method - but that's my prejudice (or
superstition) speaking.
 

Philip Herlihy

Gene E. Bloch said:
....

Thanks for this - it does seem to confirm my presumption, although I
note that you haven't experienced large incrementals as a result. I
still think it's a messy way to do file backups, though I certainly
recognise the value of a system image.
 

Philip Herlihy

Robin Bignall said:
....

Interesting. Perhaps one of those 'theoretical' problems which doesn't
matter in practice.
 

Philip Herlihy

Ashton Crusher said:
....

I was talking about the files generated by Windows Backup, which are
indeed protected against access by ordinary users. True Image doesn't
do this.
 

Philip Herlihy

Gordon said:
....

No serious error, but I think you could do better. Maybe I'm fixated on
an era when disk space really wasn't cheap, but you seem to be using
your available space wastefully. If you had to back up a file-server
containing masses of information, only a little of which changes much
(the most common situation), you would probably need to be economical
with the space you use, instead of just generating four full copies of
everything. Similarly, backing up using sector-copying tools seems to
be wasting the information which Windows uses (the 'Archive' flag) to
mark new or changed files. The backup schemes I use depend on this, and
they scale to large quantities of data, frequently backed-up.
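
To illustrate the point: the Archive attribute is visible with plain
attrib, and Robocopy can key off it directly (the paths here are just
examples):

  rem "attrib" with no switches lists attributes - the A column is the Archive
  rem flag, set by Windows whenever a file is created or modified.
  attrib "D:\Data\*.*"

  rem Differential-style copy: only files with the Archive flag set; the flag is left alone.
  robocopy D:\Data E:\Backup /S /A

  rem Incremental-style copy: same selection, but the Archive flag is reset after copying.
  robocopy D:\Data E:\Backup /S /M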

A backup scheme also needs (in my view) to prioritise the most likely
causes of loss of data. In my situation, I think the most likely cause
is accidentally deleting or overwriting a file, followed by disk
failure. Fire, theft and a giant meteorite blowing up my part of South
East England are lower down the scale, but a failure to maintain regular
backups because you can't be bothered to go and get the disk out the
cupboard is a real risk. So, my preferred backup strategy is this:

I mount an additional disk in the machine to be backed-up. I use backup
software (that in Windows 2000 or XP Pro was fine, and the Vista/W7
backup software is also ok) to generate backups automatically on a
schedule. I can rely on this running, as the disk is always present. I
check the 'report' manually once a week (having once seen a colleague
ruefully surveying 6 months of unexamined failure reports!).

On one customer's site, the file server does a Baseline ("full") backup
once a month - note that a Baseline backup resets the Archive flag for
every file it touches, after copying it. Every weekday, a Differential
is done (only copies changed or new files, but doesn't reset the Archive
flags) with an Incremental (same but does reset the flag) every
Saturday. All of this is managed by a command-line script I've evolved.
The advantage of this scheme is that (provided you haven't had to delete
the files involved through running out of space) you can go back to the
version of any file which existed on any chosen date, yet for a full
restore you need only the Baseline, the latest Incremental, and any
subsequent Differentials to get back to the latest position.
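
The heart of that script is just ntbackup's command line - roughly along
these lines, though the job names, selection file and paths here are
invented, and the real thing wraps them in date handling and logging:

  rem Monthly Baseline ("normal" backup): copies everything listed in the
  rem .bks selection file and clears the Archive flags.
  ntbackup backup "@C:\Backup\fileserver.bks" /J "Monthly Baseline" /F "E:\Backups\Baseline.bkf" /M normal

  rem Weekday Differential: changed/new files only, Archive flags left alone.
  ntbackup backup "@C:\Backup\fileserver.bks" /J "Daily Differential" /F "E:\Backups\Differential.bkf" /M differential

  rem Saturday Incremental: changed/new files only, Archive flags cleared.
  ntbackup backup "@C:\Backup\fileserver.bks" /J "Weekly Incremental" /F "E:\Backups\Incremental.bkf" /M incremental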

Note that if you damage a file without realising it, and then copy the
damaged one over the most recent backup, you've lost the game - keeping
versions is important in some situations (think of an important
financial spreadsheet which is edited daily). Note also that these file
servers run XP or Windows 2000, which allow fine control of backup modes
- Vista and later only have Baselines and Incrementals, and the utility
decides which you get.

Now, that leaves you exposed to the risk that a PSU firestorm (or some
such event) could take out the whole machine, so it's well worth copying
the backup files (the complete, interdependent set) to another host. My
script uses robocopy to update a complete set of backup files held on
another machine. In this situation, the two machines are nearby,
leaving an exposure to fire, theft, meteorites, etc. I've discussed
this with the client, but he doesn't want to bother even moving the
second machine to a different room. It's for him to weigh the cost of
loss with his view of the likelihood of these events, so I can't insist.
I have encouraged him to keep critical files (like those spreadsheets)
on something like Dropbox or Skydrive or Google Docs, as an additional
precaution.
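
The Robocopy step itself is nothing exotic - roughly this, with the
share name invented:

  rem Mirror the complete, interdependent backup set to a share on the second machine.
  rem /MIR keeps the copy exact (including deletions); /Z makes copies restartable;
  rem /R and /W stop it retrying forever if the other machine happens to be off.
  robocopy "E:\Backups" "\\OTHERPC\BackupMirror" /MIR /Z /R:2 /W:10 /NP /LOG:"C:\Logs\mirror.log"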

That file server holds a lot of stuff, much of it obsolete, of course,
but I can't be the one to weed it. A Baseline takes under 20 minutes and an
Incremental takes less than a minute, copying between two internal
disks. The Robocopy job over the network takes a bit longer (no figures
to hand, but under an hour, max.) This scheme allows all user files to
be backed-up daily, and recovered from any point in time up to several
months back. Without using vast quantities of disk space.

I do something similar on my own PC. When doing serious software
development (not as often as I'd like, these days) I use version control
software which amounts to backing up many times a day.

Going back to my OP: my customer wants to do something similar on a
Vista PC (his idea - few of my domestic customers can be persuaded even
to think about backups, sadly). The one external disk is to be USB
(I've persuaded him to invest in a USB3 card to match the drive he's
bought) which raises the possibility of a changing drive letter, which
is a problem if seeking to automate the process. And Vista protects
backup files against access by the user who initiates their creation. I
now have a tested prototype script which resides on the destination
volume and detects its current drive letter, and, if run 'elevated' can
happily access and copy the backup files, using robocopy to mirror the
backup archive on the main backup disk. He's happy with that, and so am
I. If it wasn't for the need for elevation, I could even use an
autorun.inf file to run the copying script automatically when the drive
was plugged-in.
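
The skeleton of that prototype, for the record - the folder names are
examples, and the real version adds logging and a couple of sanity
checks:

  @echo off
  rem Lives in the root of the external USB disk, so %~d0 is whatever drive
  rem letter Windows has assigned it this time. Must be run elevated
  rem ("Run as administrator") or the backup files come back "Access Denied".
  set SRC=D:\Backups
  set DEST=%~d0\BackupMirror

  rem /E copies subfolders, /COPY:DAT preserves data, attributes and timestamps;
  rem files already copied and unchanged are skipped automatically.
  robocopy "%SRC%" "%DEST%" /E /COPY:DAT /R:2 /W:5 /LOG:"%~dp0lastrun.log"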
 

Ken Blake

Philip Herlihy said:
....

Each to his own, but in my view, backup to an internal hard drive is
better than no backup at all, but just barely. Next to backup to a
second partition on your only hard drive, it's the weakest form of
backup there is. It leaves you susceptible to simultaneous loss of
the original and backup to many of the most common dangers: severe
power glitches, nearby lightning strikes, virus attacks, fire, user
error, even theft of the computer.

Giant meteorites, on the other hand, are not terribly likely. <g>
 

Ken Blake

I do something similar, with three Samsung 1TB USB drives (F:). But
there are differences. My data is separated into system (C:), documents
(D:), and media (M:). My weekly backup to the currently-mounted F:
treats each drive letter differently. C:, which is the least important,
is backed up using Windows Backup, and there is only the latest copy. D:
is the most complex because I want multiple copies and I want
encryption. So my backup consists of a password-protected TrueCrypt
volume, and like you I put the date in the filename and prune when the
disk starts to run low on space. Rather than copy and paste I use sync
software (AJC Sync) which copies only new files and files where the
timestamp has changed. I also use AJC Sync to back up the media files on
M:, again only where the timestamp has changed.

The entire process is automated and runs under the Windows task
scheduler in the early hours of every Saturday morning, taking about
half an hour. I'm presented with a report on the success or otherwise of
the backups. I then rotate the disks, with there always being one
attached to the PC, one in my car, and one somewhere else.

I was about to disagree with your scheme, until I got to that last
paragraph above. What you say there is the key, and gives you
significant extra security.
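
For anyone wanting to automate a similar weekly run, the scheduling side
can be done from the command line too - something like this, with the
script name invented:

  rem Example only: run an (invented) backup script at 3 AM every Saturday,
  rem with highest privileges so it can read protected files.
  schtasks /Create /TN "Weekly backup" /TR "C:\Scripts\weekly-backup.cmd" /SC WEEKLY /D SAT /ST 03:00 /RL HIGHEST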
 

Philip Herlihy

....

Devising and testing the software for automating all this wasn't trivial
but it pays dividends in reliability. The weakest link in the chain is
me, and the less I have to do the better.
+1!
 

Philip Herlihy

Ken Blake said:
....

Depends entirely on your assessment of the probability of those hazards.
As I've said, the likeliest problems are inadvertent deletion or
overwriting of a file, or failure of a single disk. And, of course,
inertia preventing you from getting the external disk and plugging it
in. For some people, a NAS is a good solution.
 

NY

Philip Herlihy said:
....

Yes a NAS has the advantage that it can be physically remote from the
computer that is being backed up (less likely to be stolen or go up in
flames if the main PC is) and yet is always online and so you can't forget
to connect it when the automatic backup process runs.

I backup to external HDDs that are plugged into USB, and I use a
manually-initiated automatic backup program (MS Sync Toy) to compare PC
against backup drive and copy just the new/changed files. I definitely want
a backup which makes an exact copy of each file, rather than something that
merges the whole backup into one big proprietary file because it's a pain
searching through that if you want to restore a file, and if the file gets
corrupted you've lost everything. It's also why I prefer Outlook Express,
Windows Mail or Windows Live Mail over Outlook because each message is in a
separate file rather than putting everything into one humungous PST file.

Ideally I'd use a NAS in another room, but the problem with that is getting
a network signal to it: it's a pain getting Ethernet cable into other rooms
and wireless is horrendously slow for backing up multi-GB files (TV
recordings). Either way knackers the network during the backup, which is
fine if you schedule the backup for overnight, but I run SyncToy as and when
I think "right, I've changed or added some files - let's make a backup".

When I remember I remove the backup drives and keep them in another room
(guarding against casual theft, although not fire) and I take them with me
when I leave the house for a few days, both to keep them separate from the
main PC and so I've got access to files while I'm away from home, though for
trivial access I can also use LogMeIn back to the main PC - also useful for
setting extra TV programmes to record that I'd forgotten to set before I
went on holiday!

I agree that the most likely disaster is accidental deletion of files. I
deleted a folder of loads of TV programs; almost all were on the backup
drive but a few were new ones that I was about to backup but I was first
deleting (as I thought) unwanted programmes from the drive to be backed up
before comparing and backing up the rest - except I somehow deleted the
parent folder instead of a few of the files in it. And I'd already started
restoring the majority of the files from the backup before I realised that I
could use Recuva to undelete them - but by then it was too late for a lot of
them. Talk about compounding a problem through my own stupidity! In the
event, all I lost were about 10 programmes which I was able to download from
BBC iPlayer to watch, even if I no longer had a way to keep them forever,
only for the few weeks you get before iPlayer recordings time out.
 
