Re: What about 4 terabytes? RE: 46 gigabytes

From: Roy V Zimmer <roy.zimmer_at_nyob>
Date: Fri, 13 Jan 2017 14:00:55 -0500
To: CODE4LIB_at_LISTS.CLIR.ORG


Step 2 should be relatively easy, Ray, as such drives are readily 
available these days at decent prices.
Step 3 could be a stumbling block...AWS comes to mind, but I've no 
experience with that.

Roy


On 1/13/2017 1:41 PM, Schwartz, Raymond wrote:
> I found this discussion very informative.  But I would like to change a parameter from 46 GB to 4 TB.  What affordable and simple options are there for that amount of data?
>
> /Ray
>
>
> -----Original Message-----
> From: Code for Libraries [mailto:CODE4LIB_at_LISTS.CLIR.ORG] On Behalf Of Kyle Banerjee
> Sent: Tuesday, December 13, 2016 6:05 PM
> To: CODE4LIB_at_LISTS.CLIR.ORG
> Subject: Re: [CODE4LIB] 46 gigabytes
>
>> Taking things like cost, convenience, and the knowledge that my
>> solution will always include migrating forward, there is what I think I will do:
>>
>>    1. buy a pile o’ SD cards, put multiple copies
>>       of my data on each, and physically store
>>       some here and some there
>>
>>    2. buy a networked drive, connect it to my
>>       hub, and use it locally
>>
>>    3. break down and use some sort of cloud
>>       service to make yet more copies of my data
>>
>>    4. re-evaluate in 365 days; this is a never
>>       -ending process
>>
> As this is for personal data and there isn't that much of it, there are many paths that will work, including the above.
>
> We have a cultural bias towards squirreling away copies all over the place.
> The advantage is that it's impossible to lose everything. The disadvantage is that it's labor intensive, scales poorly, and both synchronization and knowing what you can really trust (completeness, integrity, etc.) become issues.
>
> Given that you can recover deleted files as well as restore previous versions from services such as Google Drive, Dropbox, etc., there's no real reason to keep copies on so many cloud services. You can just use one that can be accessed from your personal computer, cell phone, and over the web
> -- this will be far more convenient and reliable/safe than any solution involving personal hardware.
>
> kyle
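[The "pile o' copies" plan quoted above (steps 1-3) and Kyle's point about knowing what you can really trust both come down to the same two operations: replicating a file to several destinations and verifying each copy's integrity. A minimal sketch of that idea in Python follows; the function names and directory layout are illustrative, not part of any tool mentioned in the thread.]

```python
import hashlib
import shutil
from pathlib import Path


def sha256sum(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in 64 KB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def replicate(source: Path, destinations: list) -> dict:
    """Copy `source` into each destination directory and verify the copy.

    Returns a mapping of copy path -> digest. Raises ValueError if any
    copy's checksum does not match the original, which answers the
    "can I trust this copy?" question (completeness and integrity).
    """
    expected = sha256sum(source)
    results = {}
    for dest_dir in destinations:
        dest_dir.mkdir(parents=True, exist_ok=True)
        copy = dest_dir / source.name
        shutil.copy2(source, copy)  # preserves timestamps as well as data
        digest = sha256sum(copy)
        if digest != expected:
            raise ValueError(f"checksum mismatch for {copy}")
        results[copy] = digest
    return results
```

[Re-running `sha256sum` over each stored copy on a schedule (e.g. during the annual re-evaluation in step 4) would catch silent corruption on SD cards or a networked drive before the last good copy is lost.]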
Received on Fri Jan 13 2017 - 14:01:07 EST