Talk:List of former IA-32 compatible processor manufacturers

Is this a list of former IA-32 compatible manufacturers, or former x86 compatible manufacturers?

This article lists, in the "Product discontinued/transformed" section, vendors who supplied 8086-compatible and 80286-compatible processors; those were not IA-32 processors, they were "IA-16" processors.

Is the intent to list all former x86 manufacturers, or just those who made 32-bit processors? Guy Harris (talk) 19:34, 26 April 2017 (UTC)

Guy, you are confused again here, and you have never made those terms clear. The Intel IA-32 architecture processors include the early 16-bit processors, such as the 8086/8088, 80186 and 80286. Those enhanced 16-bit processors are the early IA-32 processors. There is no such thing as Intel "IA-16"; it does not exist at all. — Preceding unsigned comment added by 119.53.111.113 (talk) 14:31, 27 April 2017 (UTC)
100% wrong. If you believe that the 8086 is an IA-32 processor, you have absolutely no clue whatsoever what IA-32 is, or even what the "32" stands for there. Guy Harris (talk) 16:59, 27 April 2017 (UTC)
Intel are themselves a bit confused on this. See Chapter 2 of Intel® 64 and IA-32 Architectures Software Developer’s Manual, Volume 1: Basic Architecture, which says:

The IA-32 architecture family was preceded by 16-bit processors, the 8086 and 8088.

which indicates that they aren't IA-32 processors, but also say

The 8086/8088 introduced segmentation to the IA-32 architecture.

which speaks of them as if they were. (Of course, saying the 8086 "introduced" anything to the instruction set architecture is a bit odd, given that the x86 ISA didn't exist before the 8086, so that part of the manual really could have used a technical editor.) Guy Harris (talk) 17:51, 27 April 2017 (UTC)
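For readers following along, the segmentation the manual credits to the 8086/8088 is easy to illustrate: in real mode, the processor forms a 20-bit physical address from a 16-bit segment value and a 16-bit offset as segment × 16 + offset. A minimal Python sketch of that arithmetic (the sample segment:offset values below are illustrative only):

```python
def real_mode_address(segment: int, offset: int) -> int:
    """Compute the physical address an 8086 forms in real mode:
    the 16-bit segment is shifted left 4 bits and added to the 16-bit
    offset; the result wraps to the 20-bit address bus of the 8086."""
    assert 0 <= segment <= 0xFFFF and 0 <= offset <= 0xFFFF
    return ((segment << 4) + offset) & 0xFFFFF  # 20-bit address bus

# Example: F000:FFF0, the style of address the 8086 starts executing at
print(hex(real_mode_address(0xF000, 0xFFF0)))  # -> 0xffff0
```

Note how segment FFFF:0010 wraps around to address 0 on a 20-bit bus, which is exactly the behavior the later "A20 gate" hardware existed to control.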
No, you confused yourself. Those enhanced 16-bit processors are the early IA-32 processors, and belong to the IA-32 family. Those processors which enables Intel 64 are also IA-32 processors, including today's Intel Core i7. They are all the IA-32 processors. As to this information, you can also find clues on those Intel Official documents. The reason is obvious that Intel all the time treat the AMD64 architecture as an extension of IA-32 architecture, rather than a standalone one. so early 16-bit processor even does not support 32-bit general computing, but they are IA-32 processors; likewise, even though Intel 64 processors can perform 64-bit general computing and use a similar 64-bit instruction set, Intel intentionally treats them as enhanced IA-32 processors. Or, to put it in your terms, Intel treats the 8086/8088, 80186 and 80286 as the 16-bit version of IA-32 processors, and Intel 64 processors as the 64-bit version of IA-32 processors. Intel never made any mistake in their documents; this is not misleading, just a very different view of things they made themselves.
This article stands on the side of Intel's products, and Intel is the inventor and creator of the IA-32 architecture. The clones were intentionally designed to be compatible with software and hardware designed for Intel processors, so the term IA-32 as used here is exactly right. On the other hand, x86 is a neutral term; the clone processors are x86 processors, but this article really does emphasise Intel's products. x86-64 or AMD64 is designed by AMD/DEC Alpha, rather than Intel, and Intel should not call their Intel 64 processors as clones of AMD64. Because AMD64 processors are essentially also clones of Intel IA-32 processors. This paradox lets Intel switch its focus from architecture design to processor core design without losing its reputation as the de facto processor king.
In its documents, Intel calls the 64-bit mode "IA-32e mode", rather than the "Long Mode" found on AMD64 processors. This also indicates that Intel 64 processors are IA-32 processors, but enhanced ones compared with the Intel 80386. Intel's motivation for adopting the AMD64 ISA on its own processors is a little different from AMD's motivation for cloning Intel processors long before the Am386. Intel was forced to incorporate the AMD64 ISA into its enhanced IA-32 processors by software vendors, especially Microsoft, which refused to develop a third 64-bit version of Windows beyond the 64-bit version of Windows XP. Microsoft refused to invest in research and development for Intel's Yamhill processors, having seen the obviously wrong investment in the Itanium products. Losing the support of Microsoft, Intel seek another opportunity from Apple. That is the very reason Apple initially developed its Intel Macs on the 32-bit version of IA-32, even though at that time most Pentium 4 and Celeron parts were already 64-bit capable; Apple did not start its project on the AMD64 architecture from the beginning, and emphasised the Intel architecture all the time, even when 64-bit support was enabled in Mac OS X 10.5. They still referred to that architecture as the Intel 64-bit architecture.
What is that Intel 64-bit architecture? Running 64-bit applications on a 32-bit kernel really did confuse many purchasers into believing it was another 64-bit architecture invented and/or developed by Intel rather than AMD64. But that is not misleading; it is a reserved, water-testing term! Intel wished that Apple would take advantage of the 64-bit architecture of the rumoured Yamhill project, but Apple always had a message for the outside world: they might possibly use AMD chips in the near future! Apple understood they were not Intel's only big client; even though Microsoft refused to support Intel's own in-house 64-bit architecture, most OEMs/ODMs favoured Intel chips over AMD's. Apple just told this fake story all the time, then and now. But they would never use AMD processors, because they did not need to! They needed time to bring out their own processors, based on ARM64 rather than PowerPC from Motorola and IBM. Intel processors were only the feed helping them through those special honeymoons. So, lacking real support for its own 64-bit architecture, Intel had only one choice: incorporate the AMD64 ISA into its enhanced IA-32 processors.
AMD's motivation for cloning Intel IA-32 is obvious: they did not possess an ISA of their own at all; even the AMD64 architecture is based on the IA-32 ISA. For this very reason, I do believe Intel intentionally calls those processors, from 16-bit to 64-bit, all IA-32 processors. 221.9.17.84 (talk) 01:59, 28 April 2017 (UTC)
"Those enhanced 16-bit processors" The 8086 cannot be an "enhanced 16-bit processor" - there's nothing for it to be an enhancement of! It was the first x86 processor. As for whether it's IA-32 or not, Intel says "The IA-32 architecture family was preceded by 16-bit processors, the 8086 and 8088.", which means that the 8086 and 8088 were not members of "the IA-32 architecture family" - if they had said "The IA-32 architecture family began with 16-bit processors, the 8086 and 8088", that would be different.
"Those processors which enables Intel 64 are also IA-32 processors, including today's Intel Core i7. They are all the IA-32 processors. As to this information, you can also find clues on those Intel Official documents." [citation needed]. Point to a place in Intel's documentation where Intel says that all Intel 64 processors are IA-32 processors - not that all Intel 64 processors are backwards compatible with IA-32 processors, capable of running all software that runs on IA-32 processors, but also capable of running 64-bit software.
"The reason is obvious that Intel all the time treat the AMD64 architecture as an extension of IA-32 architecture, rather than a standalone one." Just as AMD does. To quote AMD64 Architecture Programmer’s Manual, Volume 1: Application Programming:

The AMD64 architecture is a simple yet powerful 64-bit, backward-compatible extension of the industry-standard (legacy) x86 architecture.

and

The need for a 64-bit x86 architecture is driven by applications that address large amounts of virtual and physical memory, such as high-performance servers, database management systems, and CAD tools.

Nobody with a clue ever thought of x86-64/AMD64/Intel 64 as a standalone architecture; it's obviously an extension of the 32-bit x86 architecture.
"so early 16-bit processor even does not support 32-bit general computing, but they are IA-32 processors" So what does the "32" in "IA-32" refer to? If the 8086 was an IA-32 processor, it obviously has nothing to do with "32 bits". The x86 architecture started out as a 16-bit architecture with no virtual memory support. The 80286 added segmented virtual memory, but it was still a 16-bit architecture, with 16-bit segment offsets. The 80386 was the first 32-bit x86 processor, which also added paging as well as segmentation. The next major architectural change to x86 was the addition of 64-bit support.
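The difference the "32" makes is concrete: a 16-bit segment offset can address at most 64 KiB within a segment, while the 80386's 32-bit offsets reach 4 GiB. A trivial Python sketch of that arithmetic:

```python
# Maximum bytes addressable within one segment for a given offset width.
def segment_limit(offset_bits: int) -> int:
    return 2 ** offset_bits

print(segment_limit(16))  # 8086/80286: 65536 bytes (64 KiB)
print(segment_limit(32))  # 80386:      4294967296 bytes (4 GiB)
```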
"x86-64 or AMD64 is designed by AMD/DEC Alpha, rather than Intel, and Intel should not call their Intel 64 processors as clones of AMD64." (No, the Alpha instruction set contributed little if anything to x86-64. Some of the chip designers from DEC may have worked on the K8, and the Alpha bus design was used, but the instruction set came from AMD.) The 64-bit part of Intel 64 is, by and large, a clone of AMD64. The 32-bit instruction set on which x86-64 is based was designed by Intel, but the 64-bit extensions were designed by AMD, and Intel licensed them (apparently AMD and Intel signed a patent cross-licensing agreement).
Intel most definitely describes Intel 64 as a different architecture from IA-32; chapter 2 of Intel 64 and IA-32 Architectures Software Developer’s Manual, Volume 1: Basic Architecture is called "Intel 64 and IA-32 Architectures", plural.
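The distinction is also visible to software: on both Intel and AMD parts, 64-bit (long mode) capability is reported as a separate CPUID feature flag, bit 29 ("LM") of EDX from CPUID leaf 80000001H, rather than being implied by the base IA-32 feature set. A small Python sketch that decodes that bit from a register value (the sample EDX values below are hypothetical, not read from real hardware):

```python
LM_BIT = 29  # CPUID.80000001H:EDX.LM — long-mode (64-bit) support flag

def supports_long_mode(edx: int) -> bool:
    """Decode the LM bit from the EDX value returned by CPUID leaf 0x80000001."""
    return bool((edx >> LM_BIT) & 1)

# Hypothetical EDX values, for illustration only:
print(supports_long_mode(0x2000_0000))  # bit 29 set   -> True
print(supports_long_mode(0x0000_0000))  # bit 29 clear -> False
```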
"Losing the support of Microsoft, Intel seek another opportunity from Apple." If you're saying Intel wanted Apple to port Mac OS X to Intel's alleged non-AMD64-based 64-bit extended x86, you'll need to provide a citation for that, just as you will if you're saying Apple did port to that instruction set. Guy Harris (talk) 08:09, 28 April 2017 (UTC)