Wikipedia:Peer review/Central processing unit/archive1

Central processing unit

I just finished writing this article and have had it reviewed by a few people who are much more knowledgeable than myself. It's on m:List of articles all languages should have, and was in a pitiable state before it was totally rewritten. My goal is to get this to featured article status, so I'd appreciate comments along those lines. Specifically, spelling/grammar fixes and suggestions on the general flow and scope (I did leave some discussions out of the article for brevity; see the talk page) are appreciated. Personally, I think the intro is somewhat lacking, so any suggestions there are also welcome. -- uberpenguin 02:41, 13 December 2005 (UTC)

The rest of the article is extremely technical (and, might I say, well-written); perhaps in the introduction, which will be the first (and perhaps only) part a lay-person reads, include a small segment about how the CPU interacts with the rest of a computer? As a lay person myself I think this is a decent idea, but being far from an expert on the topic, I may be totally off-base with this. Still, you've done a fabulous job of re-doing a rather weak article. Well done. Nach0king 00:42, 14 December 2005 (UTC)
I took a stab at writing a more accessible intro that ties CPUs into digital computers as a whole as well as their overall impact on technology and society. Tell me what you think. -- uberpenguin 14:45, 14 December 2005 (UTC)

Aww, come on friends, don't be scared of a little light reading. I know the article isn't perfect, and advice is really appreciated... -- uberpenguin 19:08, 19 December 2005 (UTC)

A nice article. I'd like to see some mention of Turing in it — the Turing Machine's processing unit is definitely the oldest CPU known — that's the first computer I know of where the storage and processing units are clearly distinguished. BACbKA 22:31, 20 December 2005 (UTC)
A Turing Machine is an abstract, mathematical model not meant to be physically implemented (and in fact it cannot be implemented in the general case). It is different from a modern computer, both conceptually and as a level of abstraction, and has no processing unit, only a transition table (or transition function). Therefore, I see no parallel between a Turing Machine and a CPU as described above. --128.180.45.67 02:59, 21 December 2005 (UTC)
Agreed. While there might be something to be said about Turing completeness, I don't see any reason to diverge into mathematical models. The article is hard enough to keep down to a reasonable length without trying to diverge into general computational theory. -- uberpenguin 03:37, 21 December 2005 (UTC)
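To illustrate the transition-table point above, here is a minimal, purely illustrative sketch (in Python; it is not drawn from the CPU article, and the machine, its states, and its tape encoding are invented for this example). It shows that a Turing machine's "processing" amounts to nothing more than repeated lookups in a transition table, rather than anything resembling a CPU:

# Toy Turing machine that increments a binary number written least-significant-bit
# first on its tape. All of its "processing" is a lookup in TRANSITIONS; there is
# nothing resembling a processing unit.
from collections import defaultdict

# (current state, symbol under head) -> (next state, symbol to write, head movement)
TRANSITIONS = {
    ("add", "1"): ("add", "0", +1),   # flip 1 -> 0 and carry to the next cell
    ("add", "0"): ("halt", "1", 0),   # absorb the carry and stop
    ("add", "_"): ("halt", "1", 0),   # past the end of the number: write the carry
}

def run(tape_symbols):
    # Unwritten cells read as the blank symbol "_"; the tape is unbounded in principle.
    tape = defaultdict(lambda: "_", enumerate(tape_symbols))
    state, head = "add", 0
    while state != "halt":
        state, tape[head], move = TRANSITIONS[(state, tape[head])]
        head += move
    return "".join(tape[i] for i in sorted(tape))

print(run("111"))  # "111" encodes 7 (LSB first); prints "0001", which encodes 8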
I made a few minor corrections, but there are a few more changes that I still think are appropriate.
Almost everywhere, "program" and "software" are used together. I would give up "software" in favor of "program", since the latter is a better generalized description at the functional level. I would, however, leave the first occurrence of the two terms together in order to provide a clarification for the non-specialist reader in more common terms.
The History Section discusses the differences between the von Neumann and Harvard designs, stating that features of both can be seen in modern microprocessors. It makes it clear how the von Neumann design has influenced modern designs, but it fails to explain or illustrate the influence of the Harvard design.
The problem here is that one of the most important ways the Harvard design appears in modern designs is in cache design (others are things like stream processors, some MIMD designs, etc., which are a bit too specific to address in the article). Since this article is not about caches, it would be a bad idea to take a lot of space to diverge into cache design for the sake of elaborating upon that sentence. However, I don't want to remove the reference to Harvard since it was a very important early design, and it definitely does pop up in modern designs. -- uberpenguin 20:23, 21 December 2005 (UTC)
The History Section uses terms like clock, clock frequency, synchronous, parasitic gate capacitance, etc., without explaining them in the context of (digital) circuits.
True. I added a note to see the section below on clock rate, as well as a terse indication of what parasitic capacitance is. Again, I don't want to devote much verbiage to either, since it's not really important to know what they are to understand the article. If you really think that mentioning parasitic capacitance is too distracting here, I'd rather remove it altogether than have to explain its significance. Maybe a footnote would be appropriate, but I don't really know... -- uberpenguin 20:23, 21 December 2005 (UTC)
The CPU Operation Section occasionally discusses specific hardware implementations (e.g. multiplexer, DRAM, etc.), that are not necessary for the description of the operation of the CPU.
You think so? I like to at least mention some common implementation details in parentheses so people who are interested in physical implementation have a quick reference for further reading. -- uberpenguin 20:23, 21 December 2005 (UTC)
I think you should keep it at the functional level in that section, perhaps discussing muxes and gates and whatever else in the Design and Implementation section. --Slavy13 20:50, 21 December 2005 (UTC)
Okay, I'll concede here... I'll remove those terms from that section since they don't really add much. -- uberpenguin 22:46, 21 December 2005 (UTC)
The subsections of Parallelism introduce technical details that I feel are redundant in this article.
Such as? I certainly cannot remove those sections; it would be a glaring omission to simply ignore all the enormous CPU design improvements made over the past three decades. I figure that keeping things at a simpler functional level early in the article will keep the layman's interest, but switching to a more specific and perhaps technical tone towards the end will satisfy more advanced readers. -- uberpenguin 20:23, 21 December 2005 (UTC)
I should mention here that these sections are still written at an extremely high level; I'm basically just mentioning some of the most common techniques for parallelism and their justification and purpose. -- uberpenguin 20:30, 21 December 2005 (UTC)
Perhaps the more advanced reader will seek technical satisfaction in the specialized literature that he or she undoubtedly will have access to. --Slavy13 20:50, 21 December 2005 (UTC)
By "more advanced," I mean someone with general acquaintance to computers, not any actual knowledge of the details of how they work. If I'm to write this article to only appeal to the lay man, where is the justification for writing ANY technically-inclined article on Wikipedia? -- uberpenguin 22:41, 21 December 2005 (UTC)[reply]
Ok, I see what you mean. Great job, by the way! --Slavy13 15:29, 22 December 2005 (UTC)
--Slavy13 19:51, 21 December 2005 (UTC)
Thanks for your input so far! -- uberpenguin 20:23, 21 December 2005 (UTC)