September 29, 2018
The University of Wisconsin won a big patent case against Apple in 2015, but it has now lost in the Federal Circuit Court of Appeals.
"We hold that no reasonable juror could have found literal infringement in this case," said the unanimous court, as reported by U.S. News.
22 comments:
The Wisconsin Alumni Research Foundation, or WARF, sued Apple in 2014, saying processors in Apple's iPhone 5s, 6 and 6 Plus smartphones infringed a 1998 patent describing a means to improve performance by predicting instructions given by users.
I coded that up in the late 70s. Try doing the same disk fetches that the program did a half hour ago, to eliminate I/O latency. It's too obvious to patent.
How can jurors be so stupid?
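The disk-fetch replay idea in the comment above could be sketched like this. Everything here is my illustration, not the commenter's actual code: the trace file format, the function name, and the use of `posix_fadvise` to hint the kernel are all assumptions.

```c
#define _POSIX_C_SOURCE 200112L
#include <fcntl.h>
#include <stdio.h>

/* Replay a previously recorded trace of (offset, length) pairs,
 * asking the kernel to start those reads now so the data is cached
 * before the program asks for it again.  Returns the number of
 * extents hinted. */
int replay_prefetch(int data_fd, FILE *trace)
{
    long off, len;
    int count = 0;
    while (fscanf(trace, "%ld %ld", &off, &len) == 2) {
        posix_fadvise(data_fd, (off_t)off, (off_t)len, POSIX_FADV_WILLNEED);
        count++;
    }
    return count;
}
```

On a run, the program would log each read's offset and length to the trace file; on the next run, replaying that trace up front overlaps the I/O with computation, which is the latency win the comment describes.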
I have an advanced degree in computer science from the 1970s, and my initial instinct was to agree with rhhardin. But I would like to examine the precise algorithm in Prof. Sohi's patent and compare it to the precise algorithm used by Apple in its iPhones. There are multiple ways to implement an idea, and if Prof. Sohi's patent specified a detailed algorithm for implementing the idea, and if Apple's implementation JUST HAPPENED to use the same algorithm, then the judges are wrong. Two other possibilities: Apple's algorithm is different (the judges are right, no infringement), or the patent was overly broad and only describes the general idea, in which case the US Patent Office is in error.
On that latter point, I am aware that just before the Civil War a man named Rollin White was allowed to patent the idea of revolver cylinders bored all the way through to accept metallic cartridges. Seems pretty obvious, but because of that patent the North went to war with repeating rifles and carbines like the Henry and Spencer, but only Smith & Wesson could produce handguns with metallic cartridges, and they only made small-caliber pocket guns.
FWIW, Wisconsin has a first-rate computer science department.
The question is how the patent office ever became so stupid. It used to be settled law that you could not patent a procedure or algorithm. Then the patent office started issuing patents for algorithms that supposedly were critical to some hardware patent. Now they will let you patent anything; they don't even read the applications any more.
"This case concerns a particular prediction method used to increase the accuracy of processor speculation such that mis-speculations are minimized."
Nonsense with sauce!
Apple had better lawyers.
"Anyone who desires to create wealth using patents is a fool." Orville Wright
Rat poison is called WARFarin for a reason.
It's too obvious to patent.
I thunk up some new "C-cubed" (now C3I) display stuff that I thought was pretty obvious, but the bomb-factory (my term for "aerospace company" back then) got a patent on it.
Then at another place we got some threats from Evans and Sutherland about how we might be violating their re-entrant polygon clipping patent, which they described, and we weren't violating it until I swiped it because it was pretty simple and better than my method.
This is an embarrassment. Who was the appellate attorney who represented the University?
To start off, the patent wasn't claiming speculation, per se (which I also worked with in the late 1970s), but rather one very specific speculative technique. As I understand it, the WARF patent claimed assigning a speculative predictive index to unique memory addresses, while the Apple processors utilized a 4,096-entry hash table, and thus only provided 4,096 speculative predictive indexes, based on the hash of the memory addresses being loaded and stored. That multiple addresses hashed to individual speculative predictive indices was held, by the CAFC, to mean that the indices were not associated with unique memory addresses, and thus no direct infringement. There might have been infringement under the Doctrine of Equivalents (DOE), but that was never addressed by the Court, which presumably means that it wasn't claimed by WARF at trial.
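The collision point the comment turns on can be made concrete with a toy sketch. The 4,096-entry size comes from the comment; the hash function, counter widths, and threshold are purely illustrative assumptions, not taken from the patent or from Apple's design.

```c
#include <stdint.h>

#define TABLE_SIZE 4096  /* entries, per the comment's description */

static uint8_t predictor[TABLE_SIZE];  /* small saturating counters */

/* Hash-indexed lookup: many load/store addresses collide onto one
 * entry, so a prediction is NOT tied to a unique address -- the
 * distinction the CAFC found dispositive. */
static uint8_t *entry_for(uint64_t addr)
{
    return &predictor[(addr >> 2) % TABLE_SIZE];
}

/* Train on a mis-speculation; saturate to avoid overflow. */
void record_misspeculation(uint64_t addr)
{
    uint8_t *e = entry_for(addr);
    if (*e < 3)
        (*e)++;
}

/* Suppress speculation once the counter crosses a threshold. */
int speculation_allowed(uint64_t addr)
{
    return *entry_for(addr) < 2;
}
```

In this shape, an address that has never mis-speculated can still be blocked from speculating, simply because a different address hashed to the same entry; a per-unique-address table, as the comment describes the WARF claim, cannot exhibit that behavior.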
Patent litigation is weird and hyper-technical. Interpreting claim language is a legal question, which means that it has to be defined by the trial court before a jury can address whether or not the alleged infringing device actually infringes. This is typically done in a Markman hearing, which can require expert witnesses. Then, in the actual trial, the jury gets to listen to expert and fact witnesses on both sides, and determine first whether there was literal infringement, and if not, then possible DOE infringement, etc. Meanwhile, they are also asked to determine validity, which has its own rules (validity was upheld here by the CAFC). No wonder the average patent case now runs over $1 million, up from maybe $250k when I entered the field almost 30 years ago (for one thing, that was before Markman hearings). That sort of cost is easy for a plaintiff to justify in a case, as here, involving a processor used in Apple devices. Much less so for infringement of most patents.
@Bruce Hayden, well that clarified it for me — seriously. I agree; no infringement.
Bruce Hayden, I defer to you and those like you who work in this very recondite area. From your 11:48 comment I guess the WARF patent attacks the prediction problem from one end while the Apple patent attacks it from the other end? In the space of innovation such collisions are likely, and maybe nobody is "right."
But on the larger question of patents on processes: ay yi yi. The "business methods" patenting vogue was a huge departure from old philosophies, no? We were suddenly instantiating a patentable "thing" as a process. As Hegel said, architecture is frozen music, i.e. a machine is a particular instance of a process. So when we get a general-purpose machine (computer) to perform an algorithm, we have --for that instance-- converted the machine into a patentable invention, no?
Here's a program to free a binary tree into a freelist linked by the left node that is, as far as I know, new, and free for somebody to patent.
What's new is that there's no recursion, so a huge, huge tree doesn't run the stack into the ground.
https://rhhardin.blogspot.com/2011/12/xmas-code.html
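The linked post's technique can be sketched as follows. This is my reconstruction from the description above, not the author's code: the node layout and function name are assumptions. The trick is to right-rotate whenever the root has a left child, so every node is eventually reached with an empty left subtree; its now-unused left pointer is then recycled as the freelist link. No recursion and no auxiliary stack, so even a pathologically deep tree is safe.

```c
#include <stdlib.h>

/* Hypothetical node type; the original post doesn't specify one. */
typedef struct node {
    struct node *left, *right;
    int value;
} node;

/* Dismantle an entire binary tree onto a freelist threaded through
 * the left pointers, iteratively.  Returns the new freelist head. */
node *tree_to_freelist(node *root, node *freelist)
{
    while (root) {
        if (root->left) {
            /* Right-rotate: lift the left child above the root. */
            node *tmp = root->left;
            root->left = tmp->right;
            tmp->right = root;
            root = tmp;
        } else {
            /* No left subtree: recycle the left pointer as the
             * freelist link, then continue with the right subtree. */
            node *next = root->right;
            root->left = freelist;
            freelist = root;
            root = next;
        }
    }
    return freelist;
}
```

Each rotation moves one edge off the left spine and each node is pushed exactly once, so the whole pass is linear time with constant extra space.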
In the past I have done expert witness work involving software patents. The patents I have dealt with were laughable. They did not contain a detailed algorithm but rather a high level description of a procedure that could pertain to almost anything. Even worse there was prior art for the claimed infringement. Too bad we can’t go back to the time when you had to include a working model with your patent claim.
FYI
Case: http://www.cafc.uscourts.gov/sites/default/files/opinions-orders/17-2265.Opinion.9-28-2018.pdf
Patent: https://patents.google.com/patent/US5781752A/en
Claims:
1. In a processor capable of executing program instructions in an execution order differing from their program order, the processor further having a data speculation circuit for detecting data dependence between instructions and detecting a mis-speculation where a data consuming instruction dependent for its data on a data producing instruction of earlier program order, is in fact executed before the data producing instruction, a data speculation decision circuit comprising:
a) a predictor receiving a mis-speculation indication from the data speculation circuit to produce a prediction associated with the particular data consuming instruction and based on the mis-speculation indication; and
b) a prediction threshold detector preventing data speculation for instructions having a prediction within a predetermined range.
The claim element in question is the phrase "to produce a prediction associated with the particular data consuming instruction." The CAFC determined that the processor used in the Apple devices did not, in fact, produce a prediction associated with the particular data consuming instruction, because a specific prediction could be associated with multiple addresses (which can hash to the same hash key). Roughly half a billion dollars switched sides based on the interpretation of that one word ("particular").
"the processor further having a data speculation circuit for detecting data dependence between..."
Regretter circuits are handy too, for when the speculation fails.
Bruce Hayden
The CAFC said that no reasonable jury of Badgers could find for WARF given the facts and law, IOW, they were dead wrong. That’s a bold statement in our jury system.
I think this is what happens when you patent 'patently obvious' software ideas. You can come up with an idea today that is obvious but has no practical application due to lack of hardware processing power. Hardware will then advance to the point where said idea or application can be implemented.
Few people, though, can understand the coding process and distinguish between application of an idea and magic. Pick your lawyers and jurists accordingly.
“The CAFC said that no reasonable jury of Badgers could find for WARF given the facts and law, IOW, they were dead wrong. That’s a bold statement in our jury system.”
Yeh, well, they had to find that in order to overturn a jury verdict. But aren't you essentially saying that you can't find a reasonable jury of Badgers, period, when it comes to their school? The problem is that the CAFC is sitting there in DC, unaware of local WI sentiments.
What I would be interested in finding out is why the Doctrine of Equivalents (DOE) was not pleaded and/or appealed. The Apple mechanism seems equivalent, but, in my view, probably superior, since it doesn't really require that prediction table entries be retired. Plus, the WARF technique requires a table search, while the Apple technique uses hashing and direct indexing, which should be faster. The one argument against the DOE is that both techniques were well known in the software arts well before this patent was filed (the novelty was using these techniques for data speculation control). The engineering society IP committee I sit on filed a Supreme Court brief several decades ago involving the DOE, and that Court essentially used our logic to reject DOE if the difference between the patent claims and the allegedly infringing device involved technology well known at the time the patent was written, and that may have been the problem here. The key was foreseeability: could the patent inventors have reasonably foreseen this equivalent? If they could have, no DOE.
“Regretter circuits are handy too, for when the speculation fails.”
Not sure if this is what you were driving at, but that seems to have been covered in the subject patent. The speculative results are, of course, squashed, but the count of failures is also incremented (or count of successes is decremented). These counts are the ones used for future speculation.
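The counter update described above (failures increment, successes decrement, and the count drives future speculation decisions) is the classic saturating-counter shape. A minimal sketch, with the struct, names, and saturation limit all being my assumptions rather than anything from the patent text:

```c
/* Per-instruction prediction state, as the comment describes it. */
typedef struct {
    unsigned misses;  /* saturating count of mis-speculations */
} prediction;

/* Update after a speculation resolves.  The squash of speculative
 * results on failure happens elsewhere; this only does the
 * bookkeeping that feeds future speculation decisions. */
void on_speculation_result(prediction *p, int misspeculated)
{
    if (misspeculated) {
        if (p->misses < 15)  /* saturate instead of wrapping */
            p->misses++;
    } else if (p->misses > 0) {
        p->misses--;
    }
}
```

Saturating rather than wrapping matters here: a counter that overflowed back to zero would suddenly re-enable speculation for an instruction with a long history of failures.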
“The question is how the patent office ever became so stupid. It used to be settled law that you could not patent a procedure or algorithm. Then the patent office started issuing patents for algorithms that supposedly were critical to some hardware patent. Now they will let you patent anything, they don't even read the applications any more.”
Software essentially became patentable in the early 1990s, and mostly unpatentable a bit over 20 years later, with the Supreme Court's Alice decision. We filed an amicus brief in that case too, representing hundreds of thousands of hardware and software engineers. We carefully explained why banning software patents as constituting abstract ideas (and thus being equivalent to laws of nature) was silly and ignorant. This time, that Court ignored our brief, and went back to its 1970s-era jurisprudence.
One basic problem is that there is a lot of overlap between hardware and software. If an algorithm can be expressed in software, it can often be expressed in circuitry. Indeed, in a number of situations, algorithms may start in pure software, move maybe into PLAs, etc., and then into custom circuitry. I should note that in my first software patent, in the early 1990s, I was given an algorithm, developed a circuit that implemented it, showed a flowchart to make understanding it easier (than trying to figure out my circuit diagrams), and then, at the end, mentioned that it could also be implemented in software.
I should add that the patent here repeatedly pointed out that while the technique could be implemented in software (PLA, etc), it was best implemented in circuitry, because it is so central to processor throughput.