
theodp writes "'The lack of interest, the disdain for history is what makes computing not-quite-a-field,' Alan Kay once lamented. And so it should come as no surprise that the USPTO granted Google a patent Tuesday for the Automatic Deletion of Temporary Files, perhaps unaware that the search giant's claimed invention is essentially a somewhat kludgy variation on file expiration processing, a staple of circa-1970 IBM mainframe computing and subsequent disk management software. From Google's 2013 patent: 'A path name for a file system directory can be "C:temp\12-1-1999\" to indicate that files contained within the file system directory will expire on Dec. 1, 1999.' From Judith Rattenbury's 1971 Introduction to the IBM 360 computer and OS/JCL: 'EXPDT=70365 With this expiration date specified, the data set will not be scratched or overwritten without special operator action until the 365th day of 1970.' Hey, things are new if you've never seen them before!"
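Both conventions quoted in the summary encode the expiry where the system can read it back. A minimal sketch of the patent-style convention (the M-D-YYYY directory name comes from the patent's own example; the function names are mine):

```python
from datetime import datetime, date

def expiry_from_path(path):
    """Read an expiration date out of a path like 'C:temp\\12-1-1999\\',
    assuming the last directory component is an M-D-YYYY date."""
    last = path.rstrip("\\/").split("\\")[-1]
    return datetime.strptime(last, "%m-%d-%Y").date()

def is_expired(path, today):
    # Files under the directory expire once `today` reaches the encoded date.
    return today >= expiry_from_path(path)
```

The IBM EXPDT=70365 form works the same way, just with a Julian yyddd date in the data set label instead of a directory name.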

They are deserving of the disdain and ridicule reserved for the Post Office, Congress, etc.

Which is a shame, because I've always figured they had some pretty smart people there. The examiner should have taken a shit on the application and mailed it back with a note saying, "this is what your application is worth".

They are either complete morons or...are getting payoffs. And Google will just use it as a club someday on a small outfit that doesn't have half a million dollars to fight a lawsuit.

The "summary" wholly misrepresents what the patent is about. It's not about having an expiration date in the filename at all. When someone advocating a position lies to me, as this submitter did, I figure the reason they are lying about the issue is because they realize that the truth doesn't support their position.

Rather than choosing an expiration date ahead of time, the patented method deletes a file (or not) by scaling the time to live down according to how much of the user's quota is in use, measured from the latest of several modification times. The patent covers only using that specific algorithm, and only when the TTL is represented within the filename.
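In numbers, the weighting works like this (a sketch of the claimed arithmetic, not the patent's own code; units are seconds):

```python
def weighted_ttl(ttl, quota_used_fraction):
    # The offset is the TTL scaled by the fraction of quota used;
    # the effective TTL is the original TTL minus that offset.
    return ttl - ttl * quota_used_fraction

def should_delete(now, latest_chunk_mtime, ttl, quota_used_fraction):
    # Delete once the time since the newest chunk was modified
    # reaches the quota-weighted TTL.
    return now - latest_chunk_mtime >= weighted_ttl(ttl, quota_used_fraction)
```

So a user at 50% of quota sees a nominal one-day TTL shrink to twelve hours.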

Is that algorithm obvious?
Several Slashdot commenters who say they are programmers read the explanation of the algorithm and still didn't understand it at all. One might say that if it's explained to you and you don't "get it", it's probably not obvious.

...
Is that algorithm obvious?
Several Slashdot commenters who say they are programmers read the explanation of the algorithm and still didn't understand it at all. One might say that if it's explained to you and you don't "get it", it's probably not obvious.

There's a fine line between clever and stupid. If an average programmer reads the explanation, and "Doesn't get it", it could be either. Most patents are very poor explanations for what they are about.

> There's a fine line between clever and stupid. If an average programmer reads the explanation, and "Doesn't get it", it could be either. Most patents are very poor explanations for what they are about.

But the "average programmers" here aren't motivated to try to understand it. They are motivated to find that the patent is worthless, because that's what the submitter wrote about it, and that's what they are predisposed to believe. So they are prone to glance at the application and say, "well, the cla

> When someone advocating a position lies to me, as this submitter did, I figure the reason they are lying about the issue is because they realize that the truth doesn't support their position.

I don't think it's flat-out lying. I think it's an example of the echo chamber effect.

The community believes that patents suck, that patent examiners are inept, and that patentees are using clever tricks to patent things that aren't new. So upon encountering any new patent, the submitters here don't do the har

I'm not even from the US, but I'm sure the USPTO has thought of this since the start. They had to assume that the patents will come from all fields of technology, so I don't expect there to be 20 people, the same 20 people, discussing ALL patents. I'm sure there are committees per various fields of science and technology. I'll even wiki it AFTER I have posted this, that's how sure I am of this :P

The process should be turned on its head. You're applying for a patent? It's up to you to prove you deserve to get it, and if it's found that you can't prove it or you fudged the attempt, you'll pay a fine for the privilege. Any of the patent examiners discovering a bogus application wins a bounty! Why the !@#$ wasn't it designed to work this way in the beginning?!?

It's just another version of a dog licence intended as a petty revenue stream. When a patent is granted that's proof of nothing other than the government is aware of it and has it on file - validity these days is apparently supposed to be sorted out in court and is none of the patent office's business.

Google only uses patents defensively, at least up until now. In a way it is better that such a ridiculous patent went to a non-troll company that won't use it to suppress the competition, if the USPTO is going to grant such nonsense.

Yes; maybe; and the whole summary is stupid. From claim one of the patent; the very first paragraph:

having a path name in a distributed file system, wherein the file is divided into a plurality of chunks that are distributed among a plurality of servers

So, where the mainframes of the 70s had single consolidated disk stores, this is talking about doing it on a distributed filesystem. The area of application is indeed new, completely opposite to the claim of the summary.

Patents are not supposed to control what you do; instead they control how you do it. Since the way that Google is claiming to do this is by going around comparing the timestamps on a bunch of different distributed chunks of a file, this is something that no mainframe of the 70s is likely to have had to do, so it may even be a new way to automatically delete temporary files. I wish people would begin to understand this and commenters would point it out every time. I wonder if this isn't a bunch of patent lawyers trying to make us look silly.

Having said that; If you had a distributed file which kept a timestamp on each of several separate chunks, how would you go about deciding when to automatically delete it? My guess is that the solution you would come up with quickly is basically the one in the patent. You certainly wouldn't have great difficulty in deciding how you do it; suddenly think "maybe there's a patent that might tell me how to do this"; go to the patent office and read the patent then come back inspired and manage to solve your problem only because Google was so nice as to publish their solution. Patents are supposed to record valuable secrets that companies might otherwise keep to themselves in a way that helps humanity. This one is failing at that.

What this comes down to is that the whole idea of patents on things as abstract as software is stupid and is an illegal interference in free speech, a right everyone should have under the Universal Declaration of Human Rights. The patent officers of the USPTO and the congressmen who put them there should be arrested.

The 'plurality' of chunks is irrelevant to the matter of expiring old data. What matters is that you have some sort of metadata telling the system when a blob of data that might resemble a file is to be deleted.

The USPTO, meanwhile, keeps making me feel like blowing a plurality of chunks into a water filled receptacle where upon manipulation of a lever or chain fresh water will enter the receptacle transferring momentum to the relevant chunks causing them to exit through a drain.

It's not prior art, it's obviousness. In terms of file storage, I consider myself "ordinarily skilled in the art", yet 5 years ago I put in such a system to expire files at work on a distributed filesystem. The problem is that the USPTO is allowing obvious stuff to be patented. They even admit as much - unfortunately I can't find the article - but I remember reading the USPTO saying "only 5% of patents they grant are what they call pioneer patents" (in other words, something really new and worthy of patent protection). The reform needs to be that only these "pioneer patent" applications actually get granted and the rest thrown out.

It's much worse that the USPTO approves trash like this. As long as they do, companies, trolls, and other profit seekers will take financial advantage of the stupidity. Meanwhile, the more patents the USPTO approves, the larger their budget becomes.

I am no particular fan of Google, but if the USPTO is approving this sort of thing, Google (and every other business) has to worry that some troll will land a patent on some basic part of their everyday operations, and if you can afford it, one defense is to attempt to patent everything that you do and use. They may have been as surprised as we are that it was approved.

For the USPTO this is a wonderful business model: do a crappy job and increase the demand for your services. Another recent example of this

If you actually read the patent, it is specifically for a similar method, but designed for distributed file systems. This is different from just a single file being named a certain way.
It is an algorithm based on the location of other related files, each file's modification and time-to-live (TTL) dates, and the factors determined by the (keywords here) plurality of servers.
If they tried to patent a regular temporary file that would be different, but this is a distributed system specifically for a file that is distributed in different parts on different systems.
If you still think this has been done before, I would love to see the source for that information and gladly would recant myself given that.

Here is the crux of the first claim: "1. A computer-implemented method comprising: selecting a file having a path name in a distributed file system, wherein the file is divided into a plurality of chunks that are distributed among a plurality of servers, wherein each chunk has a modification time indicating when the chunk was last modified, and wherein at least two of the modification times are different; identifying a user profile associated with the file; determining a memory space storage quota usage for the user profile; deriving a file time to live for the file from the path name; determining a weighted file time to live for the file by reducing the file time to live by an offset, where the offset is determined by multiplying the file time to live by a percentage of memory space storage quota used by the user profile; selecting a latest modification time from the modification times of the plurality of chunks; determining that an elapsed time based on the latest modification time is equal to or exceeds the weighted file time to live; and deleting all of the chunks of the file responsive to the determining."

Can we please have an end to the stupid articles where someone intentionally mis-interprets the abstract or even just the title of a patent and pretends it's some simple thing that's been done for decades to try to drum up anti-patent sentiment? There seems to be one a week or so.

Can we please have an end to the stupid articles where someone intentionally mis-interprets the abstract or even just the title of a patent and pretends it's some simple thing that's been done for decades to try to drum up anti-patent sentiment? There seems to be one a week or so.

Unlikely.
Nonetheless, anti-patent sentiment is a good thing. Far too many people assume there's some sort of fairness or justice to the whole mess, and there isn't.

It's still a dumb patent; a trivial weighting addition doesn't change this. I mean, seriously, that's less complicated than your average photoshop filter, and it's an obvious "innovation" that any engineer would think up if they were to be asked to implement file expiration on Google's platforms.

Can we please have an end to the stupid articles where someone intentionally mis-interprets the abstract or even just the title of a patent and pretends it's some simple thing that's been done for decades to try to drum up anti-patent sentiment? There seems to be one a week or so.

Not until we have really stupid patents. I'm not a DFS guy, but I am a computer guy. In every patent article there's one of you pointing out some supposed novelty. In my field, I've been through one or two and posted blow-by-blow re

If you actually read the patent, it is specifically for a similar method, but designed for Distributed File Systems.

Ahhhh... that's good.

You see, I was scared shitless that we are still quibbling over patents granted with the only claimed difference over some old methods (patented or not) being "on a computer". I see now how wrong I was: we stepped into the glorious era of the "in the cloud" claims.

Look, I know you're trying to be insightful and all, but this is Slashdot. We don't let silly facts get in the way of our unbridled hatred of all things Gub'mint!

It's a patent on storing files. Sure, it has some improvements that nobody's really used before that solve particular problems in a particular field, but I totally saw something similar sketched on the back of an envelope at my cousin's house in 1957, so this is blatantly obvious. I don't need to read the silly lawyer-speak claims to see that this

While the mechanism described isn't very original, Google has lawyers to cover their asses and patent it so that a few years down the trail when they've implemented it all through their cloud they can't get sued by Microsoft or some troll that went ahead and patented an equivalent method. Actually, if the PTO threw it out as unpatentable, Google would probably be just as happy, so they could use it without looking over their shoulders.

One doesn't get to patent filesystem features again (and especially not obvious ones) just because they're now "distributed".

Bwahahahahha. That's the industry term for "only the suckers will wait to re-file every known filesystem patent with 'on a distributed filesystem' added to the end." Referencing prior art is for weaklings since weak patents can still be used to extract medium-sized license fees from end-users before they're invalidated. Set up a good corporation to take the fall for it.

This is so true. I have quite a few patents, and I see it every day while doing art searches: the number of patents claiming things that anyone with even a halfway decent understanding or education in the field would recognize as already having been done "way back in the good old days".

Wow, has anyone patented the concept of the old mainframe Generation Data Set recently? I used them extensively back in the mainframe days and could have used a similar concept in more recent systems, but never found a real substitute in either Unix/Linux or Windows. A simple explanation for those who have not heard of them is that they are sort of a push-down stack of files managed by the OS with a fixed stack length. You could reference them by a long serial number that was of the format GnnnnVnnnn or

The same thing happened in the 80s and early 90s when microcomputers started gaining features like virtual memory, protected modes, out of order execution, etc... People thought these were all brand new things, when in fact mainframe processors had done all that 20 years prior in the 1960s.

I bet when all the kids were super-excited about programming on the i386 with its "OMG VIRTUAL MEMORY!!!" the older guys who had worked on mainframes just rolled their eyes.:)

Read about the IBM 360/91 if you want details on what I mean. It was amazing when you consider the year it came out.

I bet when all the kids were super-excited about programming on the i386 with its "OMG VIRTUAL MEMORY!!!" the older guys who had worked on mainframes just rolled their eyes.:)

You talking about the same old grey beards that gasped when the kids opened the cover on a server and added their own memory, network adaptors, backplanes, disk drives, etc without having to call IBM out to do it?

I bet when all the kids were super-excited about programming on the i386 with its "OMG VIRTUAL MEMORY!!!" the older guys who had worked on mainframes just rolled their eyes.:)

You talking about the same old grey beards that gasped when the kids opened the cover on a server and added their own memory, network adaptors, backplanes, disk drives, etc without having to call IBM out to do it?

Yeah, and then rolled their eyes again because the kids didn't know about change control, didn't notify the users about the outage, didn't verify that their backups were good (if they even had backups), and lost 6 months worth of corporate data as a result.

As an aside... I remember a few years ago - when we were still running tape backups - I went to one of our then-sysadmins and asked him to recover an important directory one of our faculty had managed to delete. I was told he couldn't do it because it would require they stop the backup system for several hours, which would throw their backup tape rotation scheme out of sync.

So we were continuously generating backups we could never actually use.

No, seriously - he didn't want to give me the files *at all*. I even told him I could wait a couple days... but it really was a bizarre case where he felt these tape backups (which, at the time, were our ONLY backups) were only for use if the building fell down. In that case, the tape set that had been rotated off site would be brought in (after purchasing all new tape drive and server hardware, of course), and used to reconstruct our servers.

I bet when all the kids were super-excited about programming on the i386 with its "OMG VIRTUAL MEMORY!!!" the older guys who had worked on mainframes just rolled their eyes.:)

Well, it was super-exciting to have it on the desktop for a reasonable price, yeah. I can't speak for everyone of that generation, but I appreciated it while still understanding perfectly well that it wasn't a new invention.

1. A computer-implemented method comprising: selecting a file having a path name in a distributed file system, wherein the file is divided into a plurality of chunks that are distributed among a plurality of servers, wherein each chunk has a modification time indicating when the chunk was last modified, and wherein at least two of the modification times are different; identifying a user profile associated with the file; determining a memory space storage quota usage for the user profile; deriving a file time to live for the file from the path name; determining a weighted file time to live for the file by reducing the file time to live by an offset, where the offset is determined by multiplying the file time to live by a percentage of memory space storage quota used by the user profile; selecting a latest modification time from the modification times of the plurality of chunks; determining that an elapsed time based on the latest modification time is equal to or exceeds the weighted file time to live; and deleting all of the chunks of the file responsive to the determining.

If the patent process were anything like the peer review process, a bunch of distributed filesystem engineers would have been asked how to implement file expiration, and their answer, within five minutes, would sound something very close to this.
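For illustration, a five-minute whiteboard answer covering the claim's steps could look like this (the ttl-N path convention and all the names here are hypothetical, not Google's):

```python
import time

def ttl_from_path(path_name):
    """Derive a TTL from the path name, assuming a made-up convention
    where a component like 'ttl-30' means 30 days."""
    for part in reversed(path_name.split("/")):
        if part.startswith("ttl-"):
            return int(part[len("ttl-"):]) * 86400
    raise ValueError("no TTL encoded in path")

def should_expire(path_name, chunk_mtimes, quota_used_fraction, now=None):
    """Claim 1 in miniature: derive the TTL from the path, shrink it by
    the fraction of quota used, and compare against the newest chunk."""
    now = time.time() if now is None else now
    ttl = ttl_from_path(path_name)
    weighted = ttl - ttl * quota_used_fraction
    return now - max(chunk_mtimes) >= weighted
```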

But, more importantly, Google seems to have actually implemented this (not bad, considering the state of things). But who honestly believes they would not have done so without the hope of patent protection?

If I've got a lot of data, it means that data with the nearest expiration date *scaled by some function of the memory pressure*, is deleted.

So, like any of the thousands of scripts sysadmins have written to check the output of df and then run rm on the temp directories, with the find -mtime threshold chosen based on the result of the df. Hint: they're quite useful for maintaining large cache trees.

To anticipate the next patent: only take the action immediately if the return value from df is on the high side, otherwise

If you screwed up like this in a company, you would be fired. Yet some dumbass government worker in the USPTO grants this, and several million dollars later it gets sorted out by the courts. One of the reasons patent litigation is out of control is because these dumbasses don't do their jobs.

Worse, the USPTO is about to switch from first-to-invent to first-to-file. You don't need to invent any more. Just find out what your competitors are doing, patent it, and sue them out of business: http://www.jdsupra.com/le [jdsupra.com]

I think it's time for a crowdsourced patent challenge web site run by the USPTO where there would be a period of public comment for each patent about to be awarded in order to help underpaid (and I imagine under-resourced) examiners find Prior Art.

A lot fewer patents might be awarded, but ones that are would be genuinely new -- this might also save the world billions of dollars.

I think it's time for a crowdsourced patent challenge web site run by the USPTO where there would be a period of public comment for each patent about to be awarded in order to help underpaid (and I imagine under-resourced) examiners find Prior Art.

A lot fewer patents might be awarded, but ones that are would be genuinely new -- this might also save the world billions of dollars.

The problem here isn't the USPTO, it's the Patent Appeals Court that modified the Supreme Court decision (that an invention needed to be more than the sum of its parts), and decided that as soon as you'd been told about an invention, your judgement would be tainted by 'hindsight bias' and thus unable to determine prior art. So unless it's written down in that form, the patent should be awarded.

What are you talking about? There is no "Patent Appeals Court" in the United States. There is the Court of Appeals for the Federal Circuit. And the standard for non-obviousness was most recently articulated by the Supreme Court in KSR v. Teleflex [wikipedia.org] , a 2007 case in which the Court held that the precise prior art combination did not need to be explicitly "written down in that form":

As our precedents make clear, however, the analysis need not seek out precise teachings directed to the specific subject matte

...for extracting random phrases out of the middle of a patent document that match prior art and posting them to a web site in order to increase hit rates. Please delete this article or you will be hearing from my lawyers!

I make no claims to the validity of the data, but the example given and the patent are different. The IBM 360 example is about *preserving* files by affording them additional protection, as opposed to the Google patent, which is about *deleting* temporary files by putting a "time to live" value actually in the directory/filename, with various ways of cleaning out these files, as well as an *additional* indicator that it is a temporary file.

geez, when is slashdot ever gonna stop running these stupid articles that only show how little the posters know about patent law or, at least, READ THE FILE WRAPPER!!! MAYBE THE IBM PATENT IS AN X OR Y DOC IN THE SEARCH REPORT!!! OR AT THE VERY LEAST, READ THE CLAIMS!!!

claim 1: A computer-implemented method comprising: selecting a file having a path name in a distributed file system, wherein the file is divided into a plurality of chunks that are distributed among a plurality of servers, wherein each ch

The NCAR Mass Store (tape archive) had an expiration period attribute (units of days) on the bitfiles. The default, if not specified was 30 days, which effectively made it a temporary file. Expiration periods of 31 days or more were considered more permanent, and the owners would receive email two weeks and one week before the projected expiration date arrived. Expiration processing was run each Sunday, and the bitfiles were moved into the trash, from which they could be recovered for another 30 days bef

I used to service Icon IIs, which used a primitive form of QNX. When the hard drive filled up, it would start deleting old files. In a school, the oldest unmodified file was usually the master password file. Since these systems didn't have a built-in root login, this meant they were self-bricking.

Your point? I deal regularly with foreign born. My wife is foreign-born. My in-laws speak tamil and many of my ex-gf's families only spoke spanish. So, really, what is your point? Do you have anything intelligent?

Becoming a US citizen does NOT mean that you have command of the English language. If you look at the CVs of the inspectors, you will find that many of them have engineering degrees from China or India. Basically, they do NOT have the exposure to the same innovations that somebody raised here has.

As someone who works with Indian engineers daily, who lives with (and is about to marry) a Chinese engineer, and who is himself "foreign born"... You are so full of racist crap, I'd be afraid to kick you out my door for fear of ruining the carpet.

Sshhh, don't tell him, but third-world South Africa was running its social welfare program with a database on a mainframe back in the 1960s. Punch cards and all.

I actually met the guy who was the chief operator/programmer on it once (in the late 90's), he declared in conversation that "I wrote code to manage a database of millions of entries on a computer with 64Kb of RAM that filled an entire floor in our building. The commodore 64 had the same memory and three times the CPU power in 1980 and you could carr

To be honest, that is not a bad idea. BUT, keep in mind that only solves IT. You have engineering as well. And in 20 years or so, you will want to have a number of asian-born inspectors. IOW, at this time, they are too early.

All I'm saying is that it seems to me that posting under a screen name isn't a whole lot different from posting as AC, in terms of willingness to reveal your identity and stand by your words. You may have a certain amount of reputation and karma to gain or lose, but that's about it. Now, when I signed on to /. using my real name, it wasn't a deliberate act of courage or anything like that--I just didn't even think about coming up with a screen name--

At one time, yes. I got my degree from Colorado State in Microbio/genetic engineering, worked at CDC, also worked as EMT. Then got into software engineering including at Metpath and CU-med. LONG ago.
in addition, we have emailed before.

One last thing: whether I operate under a pseudonym or my real name makes no difference to me. I express the same opinion all around. I have never had a friend call me racist or think that I was. The fact is that I use a pseudonym because I was well aware that spammers were coming. My initial approach was to avoid them by not registering. Now, I use one pseudonym and try to keep it quiet (not like it is that hard, but I hate spammers). In addition, now, I have a sociopathic ex, so, I stick wi

Did you also inspect the quota for the user owning the file to determine if you should delete it?

Obvious enough that it's not patent-worthy.

Were the files also stored in a distributed file system, with chunks of the file on separate systems?

Doesn't seem to have any relation to the basic principle being patented - unless you're claiming that, after someone patented the wheel, I can come in and patent the use of wheel specifically on paved surfaces.

I wonder if Google is going to chase me. I use that exact method for log file expiration in a program I wrote back in 1998 for scanning configurations across servers. From memory, I also got the code for doing that from someone else's web site that had posted the sample. The log files are written to dated folders according to how long I needed to keep a record of the specific scan being executed.

I doubt it, as the use case is completely different: you *manually selected what to delete* on log files based on criteria you created, as opposed to Google, who indicate a file is *temporary* by appending a file suffix / changing a bit / MIME type, and then deleting it when...