Forum Index / General Technology (No Console Threads) / Commodore > Motorola
cdimauro 
Re: Commodore > Motorola
Posted on 20-Apr-2025 9:13:59
#121
Elite Member
Joined: 29-Oct-2012
Posts: 4335
From: Germany

@codis

Quote:

codis wrote:
@Lou

Quote:

So yes, once again - I re-iterate that Amiga on 65816 would have naturally led to more success and eventually ARM instead of 68K and PPC death.


I doubt that.
In fact, the 65816 suffers from the same "diseases" as the x86, not to mention that 16-bit was predictably a dead end in the PC niche already at that time.
One problem is the architecture, which is mostly geared towards single-address instructions. This made sense with early 8-bit processors under severe technological constraints. Having an implicit address (the accumulator) for many instructions shortened those instructions and thus increased fetch throughput. OTOH it required a lot of auxiliary instructions just to move values/results around into other registers.
The address space segmentation is another significant disadvantage it shares with all other 16-bit processors and MCUs. I am still occasionally dealing with this issue on MCUs. Especially with larger applications requiring more than 64k, the memory model issues become a real nightmare.

In that light, an 8086 (or 286) would have been an equivalent, if not better, choice.

Even the 8086, which was released SEVEN years before the 65816, was much, much better.

In fact, it had many more registers, and it was much easier to handle both 8 and 16-bit operations without the need for the crappy SEP/REP instructions to select the data type size.
In general, handling applications within a 16-bit address space (64KB at the time) was way better on the 8086, which also showed the best code density (I think it's still unbeaten in this domain).

But even going beyond 64KB, the 8086 had FOUR segment registers, each referencing a 64KB segment, whereas the 65816 had a single register for selecting the 64KB data bank, and just a couple of instructions for changing it (using the accumulator as a temporary register, which is even worse, of course: you not only need additional instructions, hence lower performance, but this precious register gets clobbered with dirty data).
Last but not least, the 8086 had instructions for loading a 16:16 (segment:offset) "far" pointer in one shot, as well as instructions for directly calling or jumping to far addresses, even with memory-indirect operands. So it was also able to manipulate complex structures (scattered around the entire memory), and to use code anywhere in memory, in a much more efficient way than the crappy 65816.
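The segment:offset vs. bank:offset difference described above can be sketched numerically. This is a minimal Python illustration of my own (not from either manufacturer's documentation, though the address arithmetic itself is the documented behavior of both chips):

```python
# Illustrative sketch: how each scheme forms a physical address.
# The 8086 combines any of its four segment registers (CS/DS/ES/SS)
# with a 16-bit offset; the 65816 prepends an 8-bit bank number.

def addr_8086(segment: int, offset: int) -> int:
    """8086 real mode: 20-bit address = segment * 16 + offset."""
    return ((segment << 4) + offset) & 0xFFFFF

def addr_65816(bank: int, offset: int) -> int:
    """65816: 24-bit address = bank:offset (bank in bits 16-23)."""
    return ((bank & 0xFF) << 16) | (offset & 0xFFFF)

# 8086 segments overlap every 16 bytes, so a far pointer is not unique:
assert addr_8086(0x1000, 0x0010) == addr_8086(0x1001, 0x0000) == 0x10010

# 65816 banks are disjoint 64KB windows:
assert addr_65816(0x01, 0x8000) == 0x018000
```

The overlap is why 8086 compilers had to normalize "huge" pointers, while 65816 banks are simply disjoint 64KB windows selected through a single register.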

The ONLY advantage of 65816 was its ability to reference up to 16MB of memory.

However, Intel certainly wasn't sleeping: it introduced the 80186 (better performance) in 1982 and the 80286 (24-bit address space with full, extremely solid memory protection) in the same year, and then the 80386 (48-bit virtual address space, 32-bit physical address space, better performance, much better instructions) in 1985.

The 65816 only appeared in 1985, and the comparison is merciless, to say the least...

cdimauro 
Re: Commodore > Motorola
Posted on 20-Apr-2025 10:01:16
#122
Elite Member
Joined: 29-Oct-2012
Posts: 4335
From: Germany

@Lou

Quote:

Lou wrote:
@Kronos

Quote:

Kronos wrote:
@Lou

1st you would need to establish a timeline where either Hi-Toro gets early access to the chip and sees it as superior to the 68k,
or one where C= spends the resources to reengineer the A1000 and AOS from 68k to 65816 without even more delays.
That is hard.

Next you have to remember that a big part of the A1000's (limited) early success was the combo of a multitasking OS and non-segmented memory.

Now you'd have to find a way to establish a healthy market for accelerators based on an alien ARM chip, rather than the readily available 020/030 which C= eventually followed up with in the A2500 and A3000.

We are in nightmare mode.

More plausible:
The Amiga would have morphed into something like the C65 with AOS being an optional extra and no successor.

An Amiga is just a slower ST with custom chips.

"just"? The custom chips were the primary actor on the Amiga, with 68k just following.

Just...
Quote:
How well did 68k->PPC work out?

Not so good, and guess what: because Motorola decided to kill its beautiful baby.
Quote:
That's my point about the cpu...does the cpu make the system? No.

As I've already said, it depends on the system.

Specifically, the 68k was THE (note the uppercase) PERFECT (again, go check your English vocabulary) CPU for the Amiga. Period.

And do you know why: because the chipset was designed AROUND the 68000.

The chipset was the most important element, but it works in symbiosis with the 68000 in a PERFECT MATCH.

Without the 68000, the chipset would have been quite different, and very likely not as efficient (considering the processors available when the chipset was designed: 1982).
Quote:
68k is not magical.

It is, it is. But nothing that blind zealots like you could appreciate.
Quote:
There are pre-emptive multi-tasking OS(es) available for the 8/16bit cpus.

Which doesn't make such systems better: they remain pure crap.

Unsuitable and underperforming for a multitasking OS, especially if we talk about the Amiga OS. There's no way that such a crappy 65xx could have done better than even a plain 7MHz 68000.
Quote:
Commodore's engineers were pretty good at making custom 6502's. Better than Bill Mensch I'd argue.

History proves the exact opposite: Mensch delivered far better processors, whereas Commodore's engineers were barely able to copy Mensch's 65C02 with some small changes and performance improvements.

Meanwhile, and BEFORE the 65CE02, Mensch produced the 65816: simply sci-fi for Commodore's engineers...
Quote:
68000 was an expensive boat anchor.

It was expensive because it wasn't a crappy 8-bit processor only good for use as a microcontroller.
Quote:
The 65CE02 was amazing

YOUR wishful dream.

If you lift a bit something which is crappy, it's still crap.
Quote:
but they gimped it by making it pin-compatible with a 6502 hence the need for an MMU for larger memory addressing.

Pin-compatibility wasn't certainly the problem here.
Quote:
They gave it 2 more registers, a 16-bit stack pointer, 5 16-bit instructions to manipulate WORDs in memory, relocatable base-page addressing (via the new B register, vs the MMU on the C128) to speed up memory r/w even more, and extended branch jumps to 16-bit addresses.

Nevertheless, it remains a crappy 8-bit processor which arrived very late (the industry was already moving to 32-bit systems).

Only Commodore engineers made it possible...
Quote:
Hombre was going to be HP PA-RISC. Yeah, the writing was on the wall for 68K. They weren't looking at PPC.

Irrelevant. Motorola had already decided to kill its beautiful baby, so the future was already set in stone: no 68k for new Amiga systems.
Quote:
So the AAA Amiga wasn't going to be Amiga-compatible.

AAA was already Amiga-compatible. When do you plan to read some documentation, for once in your life?
Quote:
Was it even gonna be called an Amiga?

Yes.
Quote:
Might as well have called it a C5000. C for Commodore.

Nope. C also stands for Crap, like the Cxxx stuff from Commodore. Better to avoid it.
Quote:
If Commodore had gone the ARM-like route in making their own 32bit successor to the 65816, they could have actually been ARM.

LOL. Again, the wishful dreaming of a blind 65xx fanatic.

As I've said, a successor to any 65xx processor would still have been a CRAPPY processor, NOT suitable for modern computation needs.

You don't know what you're talking about, because you have absolutely no idea of what computer science is and how its needs have evolved over time.

You're here writing BS only because you have a keyboard at your fingertips, INCOMPETENT!
Quote:
Remember Acorn's BBC Micro? They only developed the ARM because Bill Mensch was lacking/lagging. Commodore could/should have been ARM.

Same as above. So, no way: Commodore has a long history of ineptitude at developing good stuff, with the exception of the VIC-20 and C64.

In fact, once the engineers of those projects left the company (and the same goes for the original Amiga team), we've seen how many stupid decisions and projects they worked on and developed.

There's a reason why Commodore knocked at HP's door for its futuristic projects: a reason you haven't got, and that you'll never get, because of your pure incompetence in the field.
Quote:

Lou wrote:
3 people created the ARM cpu.

3.

I'm pretty sure Commodore had more than 3 engineers.

You don't need 3 engineers: you need GOOD engineers.

Even ONE, single, good engineer can make the difference!

Which was NOT the case for the Commodore engineers involved in designing processors. As we've clearly seen.

cdimauro 
Re: Commodore > Motorola
Posted on 20-Apr-2025 10:37:59
#123
Elite Member
Joined: 29-Oct-2012
Posts: 4335
From: Germany

@Lou

Quote:

Lou wrote:
@matthey

I find your post mostly ridiculous. I answered your questions as to where the problem was: Bill Mensch and HIS manufacturers.
The solution: other manufacturers who used their own masks.

So Commodore was making better "6502's" than Mensch

WHEN?!? WHICH products?!?

If you talk about the 65CE02, well, IT WAS VERY, VERY LATE!
Quote:
and so was basically everyone else.

How did the TG-16 ship a 7.16MHz 65C02 in 1987? Did they use a WDC chip? No - they used a CMOS version manufactured by SEIKO and NEC.

And guess what: they required much more expensive DRAMs to reach those clock speeds, as Matt correctly reported.
Quote:
Their SiP contained their custom mmu for 2MB of addressable space. How much of a stretch would it be to make a fast 65816?

Ask Nintendo, which decided to use a much cheaper version with an 8-bit data bus for its SNES.

Anyway, sure: it was possible to make a faster 65816... but it required much faster DRAMs, hence it was much more expensive.

The 65xx family was good when memory ran at much higher clocks than CPUs, because the CPU was able to issue a memory request every cycle, thus fully utilizing (I'm simplifying here for YOUR convenience) the memory bandwidth.
However, this game ended once CPUs scaled in clock speed much better than memories did. That was a killer for processors like these, which relied on single-cycle memory access. Which, not by chance, was the reason why RISC processors utterly failed and... had to become CISC processors to survive.

Unfortunately (!), the 65xx was not able to survive, for a simple reason: once you started adding TONs of transistors just for code and/or data cache(s), having a core that used a very limited number of transistors was NOT a great advantage anymore. That's why they remained more or less the same: just small-core processors directly interfaced with (fast) memory. And only good as microcontrollers, of course.
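The CPU-vs-memory clock argument above can be captured in a toy model (my own illustration; the numbers are made up for the example, not measurements):

```python
# Toy model: a 65xx-style core issues one memory access every CPU cycle,
# so its effective throughput is capped by whichever is slower, the CPU
# clock or the rate at which memory can serve accesses. Without caches,
# clocking the CPU past memory speed only inserts wait states.

def effective_mhz(cpu_mhz: float, mem_access_mhz: float) -> float:
    """Effective access rate of a core that needs memory every cycle."""
    return min(cpu_mhz, mem_access_mhz)

# While DRAM outpaced CPUs, the core ran at full speed:
assert effective_mhz(1.0, 2.0) == 1.0
# Once CPUs outscaled DRAM, extra clock bought nothing without caches:
assert effective_mhz(14.0, 4.0) == 4.0
```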
Quote:
That wasn't their goal/need and would have potentially negated the need for the MMU since it can address 16MB natively.

See my previous post above about this: even with such an address range, the mechanism for using it was VERY CRAPPY. Way worse than even the obsolete 8086's.
Quote:
Other manufacturers were making 65816's without the bugs of the WDC version

Guess what: bugs can be fixed...
Quote:
and they were faster.

Up to what?
Quote:
FYI the 68000 is buggy, hence the 68010.

Absolutely not. The 68010 was there for OTHER reasons (first of all, proper virtual memory support, which the 68000 lacked).

You never opened the programmer's manual for either of them, right?
Quote:
The ARM emulator was emulating a 68000, not a 68030. So, was a 68030 affordable in 1987? It was barely affordable in 1990. The ARM2 in 1987 was 12MHz and was 7x better than the A500/2000's lame 7.14MHz 68000. You really should get your #'s straight.

Irrelevant + red herring: the 68020 and 68030 were already available, with high clock rates (especially the latter) and good performance.

Their price was NOT the argument here.
Quote:
A500 was just an A1000CR. A600 was basically a CDTV-CR. It's ridiculous that Amiga kept using a 68000 in so many products for so long.

It was in good company: the 68000 dominated in arcade systems, replacing virtually all 8-bit processors. Who knows why?
Quote:
"For Compatibility" - really? Once the C128 was released, many publishers RE-released C64/128 versions of their software because if your software poked the 2mhZ register in 64mode, you'd lose the VIC display.

Many? How many? Do you have a list of them?

Because I recall only A FEW titles being available for the C128. I used mine practically always in GO64 mode...
Quote:
"Compatibility" was a lame excuse.

If it wasn't, then why add a GO64 mode to the C128?

Do you know that the C128 was made (almost fully) C64-compatible ON PURPOSE, because backward compatibility was VERY IMPORTANT?
Quote:
That's a software developer/publisher problem - not a hardware manufacturer.

True, but if the vendor offers hardware which does NOT give ENOUGH ADDED VALUE (e.g. new features), then why should I, as a software developer, invest (a lot of) money just to support this slightly different computer?

Again, you talk WITHOUT knowing how things work in real, serious software development. Because you're completely incompetent.
Quote:
Re:compatibility of the AAA Amiga
the developers said it themselves - they'd have an Amiga on a card. So just like running a PC on a card, and that's how it would have been "compatible". AA, aka AGA, was just a tweak to ECS. AAA is nothing like AGA. AGA Amigas were still gimped by a 3.57MHz bus, just wider. That's why the new architecture was needed.

AAA was a great advance, but it was too late AND it carried too many things (even very stupid things).

The problem was always the same: Commodore lacked GOOD engineers, and the ones that were left had neither the competence nor the vision to correctly evolve the platform.
Quote:

Lou wrote:
@Kronos

Quote:

Kronos wrote:
@codis

Yes and no. The OCS was designed that way because the 68000 had that quirk where it could only use 50% of the memory bandwidth in a best case scenario.

The split between odd and even cycles meant that the 68000 would only lose out when executing an opcode in an odd number of cycles, or if you pushed the chipset beyond taking just half the cycles.

So in a usual 4-color WB running an application you'd see almost no downside, while a game developer would need to balance his code for either GFX or CPU.

That all went away with faster CPUs, and got completely perverted with AGA, where the memory was clocked at half the CPU's speed, meaning the CPU only had a 25% chance to get that RAM access without some wait states thrown in.

Wait states were somewhat masked by the fact that the fastest instruction executed in 4 clocks. If you used the 'fancy' addressing modes, now you're up to 8+ (and much higher) clocks, making your 7.14MHz CPU no faster than a 1MHz 6502.

LOL. Your crappy 6502 can only dream about such nice 68000 addressing modes!

How many instructions AND clock cycles would it take to emulate them? Care to show some numbers, King of Incompetence?
Quote:
Go look at how many clocks it takes to use MUL and DIV, LOL!

ROFL.

Sure, and please tell me: how many clocks does it take for a 16x16->32-bit MUL and a 32/16->16:16 DIV on your crappy 6502?

Preparing the popcorn...
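For context on that MUL question: a CPU with no multiply instruction has to synthesize one with a shift-and-add loop. Here is a minimal Python sketch of that classic routine (my own illustration, not actual 6502 code):

```python
# Sketch: 16x16->32 unsigned multiply via shift-and-add, the way a 6502
# (which has no MUL instruction at all) must do it in software. Each of
# the 16 loop iterations costs many 6502 cycles for the shifts, adds and
# branches, so even a ~70-cycle 68000 MULU comes out far ahead.

def mulu_16x16(a: int, b: int) -> tuple[int, int]:
    """Returns (32-bit product, number of shift/test iterations)."""
    a &= 0xFFFF
    b &= 0xFFFF
    product = 0
    for bit in range(16):            # one iteration per multiplier bit
        if b & (1 << bit):           # test the current bit of b
            product += a << bit      # add the shifted multiplicand
    return product & 0xFFFFFFFF, 16

prod, iters = mulu_16x16(1234, 5678)
assert prod == 1234 * 5678
assert iters == 16
```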
Quote:
The blitter fools you into thinking the cpu was great.

The CPU was GREAT regardless of the Blitter.

In fact, it's no coincidence that it was used in WORKSTATIONS.

Whereas your toy was used for... what?
Quote:
Adding an REU to a C64/128 gives it a blitter too.

After the incompetence comes the ignorance: an REU does NOT give you a Blitter, but simply a DMA channel!

A Blitter needs to use a DMA channel, but it is NOT a DMA channel!

You don't even know the basics, IGNORANT!
Quote:
That's why this is possible: https://www.youtube.com/watch?v=UPlGtSpSt3w

Good, and? You need an EXTERNAL component which embeds a custom DMA controller to do it.

Try to do it with just a C64 or a C128 (even expanded to 256KB)!

You're a hopeless loser who is only capable of wishful dreaming about your CRAPPY architecture!

cdimauro 
Re: Commodore > Motorola
Posted on 20-Apr-2025 10:43:18
#124
Elite Member
Joined: 29-Oct-2012
Posts: 4335
From: Germany

@codis

Quote:

codis wrote:
@Lou

Quote:

Wait states were somewhat masked by the fact that the fastest instruction executed in 4 clocks. If you used the 'fancy' addressing modes, now you're up to 8+ (and much higher) clocks, making your 7.14MHz CPU no faster than a 1MHz 6502. Go look at how many clocks it takes to use MUL and DIV, LOL!

This is an area where I have just very superficial knowledge.
A fancy (complex) instruction set can map relatively complex HL constructs to very few machine instructions. That looks great on paper, but how many cycles those instructions require, especially for memory accesses, is a different question.

Correct. And a processor which allows such complex computations thanks to its instructions and/or addressing modes is greatly preferable, because those are the needs of modern computing challenges.

That's why even in the video shared before by that incompetent, the speaker clearly says that such 65xx processors were NOT suitable for more complex tasks, like code written in C.

The problem is that he doesn't understand even the basics of computing, and doesn't even understand the content of the videos he shares. Hopelessly ignorant...
Quote:
This is where RISC's load-store architecture made its mark.

Well, RISCs... were OK... ON PAPER: The final RISCs vs. CISCs

cdimauro 
Re: Commodore > Motorola
Posted on 20-Apr-2025 11:16:20
#125
Elite Member
Joined: 29-Oct-2012
Posts: 4335
From: Germany

@Kronos

Quote:

Kronos wrote:
When the 68k line was declared EOL PPC was a viable solution that just made perfect sense with the information available at that time.

Not exactly. The 68k was EoL only because Motorola decided to kill it and go to PowerPC.

So, PowerPC was the cause of the 68k being EoL, and not its viable solution: it was the replacement.

cdimauro 
Re: Commodore > Motorola
Posted on 20-Apr-2025 11:27:11
#126
Elite Member
Joined: 29-Oct-2012
Posts: 4335
From: Germany

@matthey

Quote:

matthey wrote:
Most early RISC architectures used 32-bit fixed length encodings making all instructions 32-bit and practically requiring more expensive 32-bit memory to maintain performance.

Fetching instructions from memory
ISA | Instruction size | 8-bit memory | 16-bit memory | 32-bit memory

ARM   | 32-bit | 4 cycles | 2 cycles | 1 cycle
Thumb | 16-bit | 2 cycles | 1 cycle  | 1 cycle

Just one thing: Thumb has a few 32-bit instructions (for long branches, and to switch to/from ARM mode).
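The fetch counts in the quoted table are just the ceiling division of instruction size by memory bus width; a tiny Python sketch of my own reproducing that arithmetic:

```python
# Sketch: with no cache, an instruction must be fetched in full before
# it executes, so the number of bus cycles per instruction fetch is the
# instruction size divided by the memory width, rounded up.

def fetch_cycles(insn_bits: int, bus_bits: int) -> int:
    """Bus cycles to fetch one instruction (ceiling division)."""
    return -(-insn_bits // bus_bits)

# Reproduces the table above (ARM = 32-bit, Thumb = 16-bit opcodes):
assert [fetch_cycles(32, w) for w in (8, 16, 32)] == [4, 2, 1]
assert [fetch_cycles(16, w) for w in (8, 16, 32)] == [2, 1, 1]
```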
Quote:
Early ARM instruction cycle latencies were not all single cycle either.

Instruction      Cycles   Notes
ALU              1        +1 with a register-specified shift (Rs); +2 if Rd is pc
B, BL, BX        3
CDP              1+B
LDC              1+B+N
LDR/B/H/SB/SH    3        +2 if Rd is pc
LDM              2+N      +2 if pc is in the register list
MCR              2+B
MLA              2-5
MRC              3+B
MRS, MSR         1
MUL              2-5
STC              1+B+N
STR/B/H          2
STM              1+N
SWI, trap        3
SWP/B            4

Instruction latencies were generally better than the 68020 and 68030 but ARM timings in memory suffered more from early typical cheap memory. ARM instruction latencies did not improve much from increased pipeline lengths where 68k instruction latencies improved with increased pipeline depth eventually surpassing most of these timings with the 68060. The RISC major advantage is simpler and easier to design cores but the major disadvantage is the memory bottleneck which still exists today.

The "funny" thing is that the first ARM processors were... FULLY MICROCODED!

"RISC"? Really?
Quote:
The original ARM architecture had other issues and inefficiencies.

o early 26-bit address space hardware limitation

That's the most important mistake that they did.

It's unbelievable that they decided to cripple their architecture by binding it to a max 64MB address space in... 1985!

And that only to save some gates (or, more likely, to avoid using another GP register out of the 16) by embedding the status flags in the PC.

A crappy solution which they had to change very quickly, after a few years.

That's a great example of an architecture which was Bad-By-Design.
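For reference, the packing in question: the original 26-bit ARMs stored the status flags and mode bits in R15 alongside the program counter. A small Python sketch of my own decoding that layout:

```python
# Sketch of the 26-bit ARM R15 layout: processor mode in bits 0-1,
# word-aligned PC in bits 2-25 (hence the 2^26 = 64MB address limit),
# and the N/Z/C/V/I/F flags in bits 26-31.

def unpack_r15(r15: int) -> dict:
    return {
        "mode":  r15 & 0b11,          # bits 0-1: processor mode
        "pc":    r15 & 0x03FFFFFC,    # bits 2-25: word-aligned PC
        "flags": (r15 >> 26) & 0x3F,  # bits 26-31: N Z C V I F
    }

# Example value: N (bit 31), I (bit 27), F (bit 26) set, PC=0x10, mode=3
s = unpack_r15(0x8C000013)
assert s["pc"] == 0x10
assert s["mode"] == 3
assert s["flags"] == 0b100011
```

Every flag update, mode switch, and branch shares the same register, which is exactly why widening the address space later forced an incompatible change (the separate CPSR of the 32-bit ARMs).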
Quote:
o PC in GP register file and LR only leave 12-13 GP registers which is low for a RISC architecture
o poor code density despite only encoding 16 registers
o shift is not as cheap as sign and zero extension for decompressing immediates
o 32-bit fixed length instructions and poor code density was bad for embedded use
o Only 12-13 GP register performance handicap and 26-bit address space was bad for desktop and workstation use

Which is the reason why it gained very little market share at the time.

Kronos 
Re: Commodore > Motorola
Posted on 20-Apr-2025 12:04:43
#127
Elite Member
Joined: 8-Mar-2003
Posts: 2749
From: Unknown

@cdimauro

Quote:

cdimauro wrote:

Not exactly. The 68k was EoL only because Motorola decided to kill it and go to PowerPC.

So, PowerPC was the cause of the 68k being EoL, and not its viable solution: it was the replacement.


Why and how the 68k was EOL is irrelevant as to whether PPC was a viable solution for future Amigas.

Sure Motorola wanted everybody to switch to PPC but as far as it concerned Amiga (late C=, ESCOM and GateWay) and Phase5 it was one option of many.
X86, ARM or something a bit more obscure could have been made to work; some of them might seem like a better choice looking back, but back then PPC was (just) the most logical choice.

_________________
- We don't need good ideas, we haven't run out on bad ones yet
- blame Canada

cdimauro 
Re: Commodore > Motorola
Posted on 20-Apr-2025 14:50:07
#128
Elite Member
Joined: 29-Oct-2012
Posts: 4335
From: Germany

@Kronos

Quote:

Kronos wrote:
@cdimauro

Quote:

cdimauro wrote:

Not exactly. The 68k was EoL only because Motorola decided to kill it and go to PowerPC.

So, PowerPC was the cause of the 68k being EoL, and not its viable solution: it was the replacement.


Why and how the 68k was EOL is irrelevant as to whether PPC was a viable solution for future Amigas.

Of course. And like others did, even on different platforms: since there was no 68k anymore, companies had to look at something else.
Quote:
Sure Motorola wanted everybody to switch to PPC but as far as it concerned Amiga (late C=, ESCOM and GateWay) and Phase5 it was one option of many.

Correct. There were announcements for different hardware platforms, and software platforms (Windows NT).
Quote:
X86, ARM or something a bit more obscure could have been made to work, some of them might seem like a better choice looking back, but back then PPC was (just) the most logical choice.

It depends on the specific context / project and time frame that you are referring to.

Because already by the end of the '90s, PowerPCs weren't competitive anymore.

In fact, in 2000 even Apple was planning to abandon PowerPC and move to Intel. The decision was shelved only because an IBM manager offered the (yet-to-be-developed) G5 to Jobs. But, as we know, it was just a stopgap, and Apple switched to Intel after a few years.

matthey 
Re: Commodore > Motorola
Posted on 20-Apr-2025 19:42:27
#129
Elite Member
Joined: 14-Mar-2007
Posts: 2624
From: Kansas

cdimauro Quote:

Just one thing: Thumb has a few 32-bit instructions (for long branches, and to to switch to/from ARM mode).


Most Thumb instructions are 16-bit, like the 68k's. My point was that without caches the instruction has to be fetched before execution, and with small instruction fetches, narrow DRAM and the DRAM wait states that were common in the early days of MPUs, this memory bottleneck made RISC less appealing than CISC. ARM likely would have been more popular had it started with a 16-bit fixed-length or variable-length encoding rather than a 32-bit fixed-length encoding. SPARC and MIPS were direct descendants of the Berkeley and Stanford RISC projects, while the ARM architecture was a new design that aimed lower.

The ARM RISC Chip: A Programmer's Guide
https://arcarc.nl/archive/Books/The%20ARM%20RISC%20Chip%20-%20A%20Programmer%27s%20Guide%20-%20Alex%20Van%20Someren%20&%20Carol%20Atack/The%20ARM%20RISC%20Chip%20-%20A%20Programmer%27s%20Guide%20-%20Alex%20Van%20Someren%20&%20Carol%20Atack.pdf Quote:

Genesis of ARM in comparison with other RISC processors

In fact, many of the commercially available RISC processors intended for use as the CPU of a personal computer or workstation were designed or developed in-house by system developers, when microprocessor developers were either concentrating on improving their CISC designs or designing RISC chips for supporting roles or as embedded controllers.

For example, Sun developed the SPARC RISC chip and architecture for its own computer workstations, while notable RISC processors from established chip producers include Intel's i860 graphics processor and AMD's 29000, which has mainly been used as a graphics accelerator or in printers. However, both Sun's and MIPS' efforts were based on earlier research efforts at Stanford and Berkeley universities respectively, while Acorn's project was effectively begun from scratch, although reports on the Berkeley and Stanford research were read by the Acorn team and were part of the inspiration behind designing a RISC processor.

One of the reasons the ARM was designed as a small-scale processor was that the resources to design it were not sufficient to allow the creation of a large and complex device. While this is now presented as (and genuinely is) a technical plus for the ARM processor core, it began as a necessity for a processor designed by a team of talented but inexperienced designers (outside of university projects, most team members were programmers and board-level circuit designers) using new tools, some of which were far from state-of-the-art. With these restrictions on design and testing, it is hardly a surprise that a small device was developed.

While the ARM was developed as a custom device for a highly specific purpose, the team designing it felt that the best way to produce a good custom chip was to produce a chip with good all-round performance.

1.2.3 Designing the first ARM

Work on the development of what was to become the ARM began in 1983. Working samples were received in 1985. The team developing it included Steve Furber, now ICL Professor of Computer Engineering at Manchester University, and Roger Wilson, both of whom had worked on the design of the BBC Micro, as well as Robert Heaton who led the VLSI design group within Acorn.

The design team worked in secret to create a chip which met their requirements. As described earlier, these were for a processor which retained the ethos of the 6502 but in a 32-bit RISC environment, and implemented this in a small device which it would be possible to design and test easily, and to fabricate cheaply.

First the instruction set was specified by Wilson, based on his knowledge gained as the author of much of the original software for the BBC Micro, including its BASIC interpreter. The important initial decisions were to use a fixed instruction length and a load/ store model. Other design decisions were taken on an instruction by instruction basis.

...

1.4.1 High performance for low price

The original ARM1 device was intended to power an Acorn computer, a personal computer rather than the workstations which other RISC processors such as the MIPS and the SPARC were designed for. Rather than use the advantages of RISC to make a large chip, more powerful than its CISC equivalent, the Acorn chip used RISC techniques to make a smaller chip of equivalent power to those used in other personal computers.

The ARM processor has always differed from other commercially available RISC processors in that it is intended to meet a price/performance ratio rather than to be the most powerful processor available. Acorn's computers have always been aimed at the middle of the market, so the processor designed to power them was too. ARM processors are not the most powerful, but offer an extremely good price/performance ratio compared to other processors, at about a dollar per million instructions per second (MIPS) in the case of ARM6.


With the small ARM core size, ARM could have integrated more logic into the MPU rather than using 3 support chips. Along with the introduction of the Thumb ISA, more integration was one of the reasons why ARM started to see success in the embedded market.

Quote:

1.4.4 Easily customized designs

The above factors combine to make the ARM product range extremely flexible. The small size of the ARM processor means that it can easily be combined with its support chips, cache memory, or custom circuitry to make self-contained custom chips. All ARM devices are designed as macrocells, building blocks which can be combined within a single chip.

The ARM610, commissioned by Apple, is one example based on macrocells, which includes the 32-bit ARM6 processor core, a 4 kbyte cache, a write buffer and a memory management unit. Even with all these additional components, the end result is a much smaller package than familiar processors such as the 80386.

Acorn Computers has also enjoyed the fruits of commissioning a custom chip from ARM which effectively combined the original ARM2 four chip set on to a single device, the ARM250. This process was carried out from the original concept to volume production in 12 months, resulting in a single device with a sixth of the footprint, one third the power consumption and half the cost of the devices it replaced.


The ARM250, which integrated the 4-chip set into a single device with a sixth of the footprint, one third the power consumption and half the cost of the devices it replaced, was a game changer. Something similar should have been done earlier, just like the 68k Amiga SoC that Commodore planned, delayed and failed to deliver before going bankrupt.

Another important advancement from ARM for embedded use was the move to a fully static CMOS design.

Quote:

ARM becomes the Advanced RISC Machine

By 1990 it was clear that although Acorn's financial position had stabilized, an in-house processor design team was an expensive luxury for a small company to support. The ARM development team had now produced a static version of the processor, the ARM2aS, making it even more attractive to potential third-party customers. This new variant added low power consumption to the list of features which made the ARM attractive to developers interested in designing low-cost portable and hand-held devices and electronic personal organizers. It was intended for inclusion in a hand-held personal electronic organizer and communications device, which although developed as far as working prototypes was never actually marketed (the Active Book).

Interest in the ARM family was growing as more designers became interested in RISC, and the ARM's design was seen to match a definite need for high-performance, low power consumption, low-cost RISC processors. In conditions of greatest secrecy an agreement was reached between Acorn, VLSI Technology Inc. and a company which had expressed an interest in the ARM for some time now, Apple.


I have talked before about the importance of fully static CMOS designs, which allow the clock frequency to vary between zero and the maximum. The 68040V and 68060 were early 68k designs that were fully static CMOS. Motorola developed and produced fully static CMOS 68000 CPUs which were likely also used as modular cores in early 68k SoCs. The 68060 was also designed to be modular for SoCs. It was ARM which created the AMBA bus standard, which even Motorola later used for ColdFire.

https://en.wikipedia.org/wiki/Advanced_Microcontroller_Bus_Architecture

ARM made plenty of mistakes early on but, for a new team with a new design, they were not too bad. They persistently improved and survived in a very competitive environment. They were still only #4 in the crowded 32-bit embedded market in 1997 (ahead of #7 PPC) and I cannot even name the #4 player today. The competition and innovation have all disappeared today.

cdimauro Quote:

The "funny" thing is that the first ARM processors were... FULLY MICROCODED!

"RISC"? Really?


I would say ARM was minimally microcoded.

https://www.righto.com/2016/02/reverse-engineering-arm1-processors.html

If there were a firm definition of fully microcoded, I would say it is the 68000. Supposedly, the 68000 allowed the ISA to be changed. A System/370 ISA could supposedly be microcoded instead of the 68000 ISA.

https://thechipletter.substack.com/p/motorola-intel-ibm-make-a-mainframe

Perhaps the ability to fully change the ISA should be the definition of fully microcoded?

Last edited by matthey on 20-Apr-2025 at 07:45 PM.

 Status: Offline
Profile     Report this post  
cdimauro 
Re: Commodore > Motorola
Posted on 21-Apr-2025 5:20:22
#130 ]
Elite Member
Joined: 29-Oct-2012
Posts: 4335
From: Germany

@matthey

Quote:

matthey wrote:
[...]
The ARM250 which integrated the 4 chip set, created a single device with a sixth of the footprint, one third the power consumption and half the cost of the devices it replaced. That is a game changer and should have been done earlier just like the 68k Amiga SoC that Commodore planned, delayed and failed to deliver before going bankrupt.

You need GOOD engineers for that, which Commodore lacked. They were not even able to merge Agnus and Paula into a single chip...
Quote:
Another important advancement from ARM for embedded use was the move to a fully static CMOS design.

Quote:

ARM becomes the Advanced RISC Machine

By 1990 it was clear that although Acorn's financial position had stabilized, an in-house processor design team was an expensive luxury for a small company to support. The ARM development team had now produced a static version of the processor, the ARM2aS, making it even more attractive to potential third-party customers. This new variant added low power consumption to the list of features which made the ARM attractive to developers interested in designing low-cost portable and hand-held devices and electronic personal organizers. It was intended for inclusion in a hand-held personal electronic organizer and communications device, which although developed as far as working prototypes was never actually marketed (the Active Book).

Interest in the ARM family was growing as more designers became interested in RISC, and the ARM's design was seen to match a definite need for high-performance, low power consumption, low-cost RISC processors. In conditions of greatest secrecy an agreement was reached between Acorn, VLSI Technology Inc. and a company which had expressed an interest in the ARM for some time now, Apple.


I have talked about the importance of fully static CMOS designs before which allow the clock frequency to vary between zero and the max. The 68040V and 68060 were early 68k designs that were fully static CMOS designs. Motorola developed and produced fully static CMOS 68000 CPUs which were likely also used as modular cores in early 68k SoCs.

That's very important for the embedded market, but less appealing for the desktop market. The laptop is another market segment where it's important to dynamically clock a processor down to low frequencies, but it was a niche market at the time (nowadays it is the most important one).
Quote:
The 68060 was also designed to be modular for SoCs. It was ARM which created the AMBA bus standard which even Motorola use later for ColdFire.

https://en.wikipedia.org/wiki/Advanced_Microcontroller_Bus_Architecture

ARM made plenty of mistakes early but for a new team with a new design they were not too bad.

I wasn't aware of this. They did great work, but it's for the embedded market, as we can see. ARM decided to go in this direction after the failure of the Archimedes line, as we know. And it was the way to its success.
Quote:
They persistently improved and survived in a very competitive environment. They were still only #4 in the crowded 32-bit embedded market in 1997 (ahead of #7 PPC) and I can not even name the #4 player today. The competition and innovation has all disappeared today.

There's innovation, even more than in the past, because it's easy to experiment and design new things.

But the problem remains going to market and acquiring a segment. There's a very high barrier which prevents innovative products from reaching that goal.
Quote:
cdimauro Quote:

The "funny" thing is that the first ARM processors were... FULLY MICROCODED!

"RISC"? Really?


I would say ARM was minimally microcoded.

https://www.righto.com/2016/02/reverse-engineering-arm1-processors.html

Yes, I know it, but this shows that the ARM is fully microcoded.

There's little microcode logic, yes, because the design is very simple. But everything is microcoded (see below).
Quote:
If there were a firm definition of fully microcoded, I would say it is the 68000. Supposedly, the 68000 allowed the ISA to be changed. A System/370 ISA could supposedly be microcoded instead of the 68000 ISA.

https://thechipletter.substack.com/p/motorola-intel-ibm-make-a-mainframe

Perhaps the ability to fully change the ISA should be the definition of fully microcoded?

Fully microcoded is when each processor instruction is split into microinstructions, which are the ones really executed, with a sequencer fully controlling this process. That's the reason why ARM is fully microcoded.

The PC XT/370 is the most impressive design which I have ever seen. Unbelievable what they were able to do. Kudos to the architects!

However, this was possible because the 68000 and System/370 had A LOT of similarities (big endian, 16/32/48-bit opcodes, 16 registers). You can't do the same trick by using the 68000's microcode to emulate an 8086.

cdimauro 
Re: Commodore > Motorola
Posted on 21-Apr-2025 6:06:51
#131 ]
Elite Member
Joined: 29-Oct-2012
Posts: 4335
From: Germany

@Hammer

Quote:

Hammer wrote:
@cdimauro

Quote:
What's not clear to you about this:

you're not able to understand that the install base was totally irrelevant for the new machines, like the Amiga, which quickly received support from the software houses with software of any kind.

?

Which is THE CONTEXT of this part of discussion.

And this even for the A1000, which sold only a few thousand. Care to explain how it was possible?

However, the problem is that bots do NOT understand, have NO memory, NO logic, hence can't get the context nor know the history.

As I've said, you are (also) an alien, because you have no clue, at all, of the Amiga history and what happened to our (excluded you, of course) beloved platform.

Your "easier to program with 68K" argument is useless when there are other fuckups with 68K Amiga platform e.g. very low install base with high resolution display.

Irrelevant + Red Herring. You're lacking simple, elementary logic: the fact that the install base was low and there was no stable high resolution (because that's what was missing: there WAS high resolution, but interlaced) does NOT imply that the Amiga could not have gained professional software.

In fact, history proves the exact opposite: Amiga had A LOT of professional software, and even became THE market of reference in certain areas.

It's enough to take a look at the software which was developed for / during the Amiga 1000, to clearly see this. The A1000 had a miserable install base, yet it received A LOT of support (NOT only games!). That's, again, HISTORY!

Last but not least, the 68000 was very important for this, because it allowed code to be written quickly. That was very important especially for professional software, because the processor offered a fully linear address space, 16 registers, and more complex addressing modes which fit the needs WAY BETTER in this case.

You don't know this and, most importantly, you can't appreciate it, because you never developed software in 68k and/or x86 assembly. I did; I've spent YEARS on that, and I'm fully aware of it, so I can talk. You, instead, continue to talk of things of which you have no clue, at all!

BTW, the Amiga OS helped a lot as well, because it offered a consolidated set of APIs & libraries which could be used out-of-the-box to create sophisticated GUIs. Spot the difference with the DOS applications that you've cited, which had NOTHING like this...

Again, you've no idea of how coding was done at the time. You talk only because you've a keyboard in your hands.
Quote:
From ground zero, only 68K Macintosh was able to establish a large enough business customer base that could spend 1 million PowerMac unit sales from March 1994 to January 1995.

The discussion was about the early days of the Amiga (1985 and so on), and guess what: you talk about when Commodore was already bankrupt.

Off topic? No: it's "simply" Hammer's total non-sense and his complete inability to follow a discussion. Hopeless!
Quote:
Despite 68000's early 32-bit programming model and Amiga's early multimedia technical lead, poor management proved to be the real boat anchor for both Motorola and Commodore.

Totally irrelevant + another Red Herring. You talk of things which are totally disconnected.

As usual, because you don't understand the context and randomly put together things which have no relationship.

Again, hopeless...
Quote:
Atari ST's annual unit sales growth stalled after 1987. Other GUI platform competitors exceeded Atari ST/Mega ST's monochrome high-resolution offering.

Nevertheless, Atari gained some professional market: music/MIDI and desktop publishing.
Quote:
For the US market after 1986, Amiga's stable 600x200p NTSC resolution is not enough for a mainstream business platform.

Nevertheless, and despite this, A LOT of professional software was developed for the Amiga. How was it possible? It's a mystery, but only for Hammer, who completely ignores the Amiga history...
Quote:
Where's your "easier to program with 68K" argument?

See above for that.

Plus, another Red Herring: the two things are NOT related.
Quote:
Have you realized your "easier to program with 68K" argument is useless?

The only thing that is easy to realize is your complete lack of contextualization, since you talk of two completely different things, which you decided to put together for unknown reasons.

Logic: not found in Hammer's brain...
Quote:
You can't focus on just the CPU when the platform is the entire desktop computer solution.

In fact, I haven't done it. The 68k was a GREAT contributor when talking about writing software, but it wasn't the only factor here.

That's something that you've realized, but without any base, since I've said nothing like that.

You completely invent things because your brain is not able to understand what people write and starts derailing with hallucinations.
Quote:
Your cited 6888x 3D application's 1988 release is late!

Sure, sure. The Amiga was THE machine which brought 3D to the consumer and professional market, but 1988 was... late.

Only for Hammer, because the reality is completely different. His brain is affected by heavy hallucinations coming from the parallel universe where it's living...
Quote:
Autodesk developed Autoshade in 1987 for AutoCAD. Autodesk 3D Studio later displaced Autoshade.

Totally irrelevant. Have you checked how many 3D applications were developed for the Amiga? Clearly not.

They were released even after Commodore went bankrupt. For example, Real 3D 2.0, with inverse kinematics implemented (and Forth-based), was developed in 1995.

How was it possible? Another mystery, but only for Hammer's brain which is lost in its parallel universe...
Quote:
My point, PC's X87 market drivers are larger than Amiga's 6888x!

Oh, really? I wasn't aware of it: thanks for recalling it, Monsieur de La Palice!

cdimauro 
Re: Commodore > Motorola
Posted on 21-Apr-2025 6:46:59
#132 ]
Elite Member
Joined: 29-Oct-2012
Posts: 4335
From: Germany

@Hammer

Quote:

Hammer wrote:
@cdimauro

Quote:

And what's problem with that?

68k platforms received A LOT of support, despite the very limited numbers (compared to PCs).

Without a timeline, that's meaningless fluff.

The timeline was already reported in the discussion: if you're not able to follow it, then it's YOUR problem!

BTW, you are the last one who could ask other people about a timeline, since you continuously mix totally different periods of time!
Quote:
Quote:

Even FPU support, as I've proved.

The 1988 release is already late.

ROFL. You don't know what you're talking about! Late for what?!? Care to PROVE IT? Because the Amiga history has shown a completely different picture, with a lot of professional software developed, some of which received FPU support.
Quote:
The FPU is not the only factor in a desktop platform.

Then what's YOUR problem?!?
Quote:
For the 1985 to 1988 context, Lotus 123's back office market is larger than the 3D application market, e.g. Lotus is larger than AutoDesk in unit sales and revenue.

And? WHO CARES! This has NOT prevented professional software from being developed for OTHER markets and OTHER platforms.

Besides that, your statement is total non-sense. So, since the back office market was the larger one, what should the software houses have done? Only develop that kind of software, leaving out EVERYTHING ELSE?!?

You're completely detached from reality!
Quote:
Quote:

Do you recall WHY Kickstart 1.2 was mentioned, WHO mentioned it, and the real situation? No, because you completely lost the context... as usual.

Your cited 1988-released 3D software didn't work for KickStart 1.1/Workbench 1.1.

Sure, because it required Amiga OS 1.2 which... drum roll... was already available in 1988 for ALL Amiga platforms.
Quote:
A2000-B was released in March 1987 with Kickstart 1.2.

Thanks for confirming it.
Quote:
Quote:

Yes, it was available one year after that the A1000 was first sold.

From September 1986 and beyond. The early Kickstart 1.2 version didn't have the V1.2 marking.

There was a Kickstart 1.2 delay rollout due to unhappy engineers.

And? Who cares. The important thing is that Kickstart 1.2 was available in... drum roll... 1988!
Quote:
Amiga 1000's production was terminated in early 1987.
From "Commodore - The Final Years"

The show marked the official release of the Sidecar, the IBM PC emulator for the Amiga 1000 announced so long ago. Demo versions of the units had been available to Amiga dealers as early as 1986, but the product had entered limbo as engineers worked out the bugs.[3] In December 1986, the hardware finally passed FCC regulations and production began in January 1987.

Unfortunately, the software still had several known bugs. Dealers began selling the units in February using pre-production software, with the promise of a disk upgrade later in the year. Although Commodore previously announced it would market the Sidecar for “significantly below $1000”, it was now put on sale for a suggested retail price of $999. The Sidecar began appearing en masse in retail stores later in June.

In one of its legendary marketing fiascos, the important device appeared just in time for Commodore to phase out the A1000, on which it was dependent. Magazines speculated that, with Commodore moving on to the Amiga 2000, they did not want to sell too many Sidecars. The belated release was primarily meant to avoid false advertising lawsuits.

Totally irrelevant.
Quote:
A1000 is being phased out in early 1987.

And? Was the Amiga OS 1.2 (and even 1.3!) developed and available for it? Yes or no?
Quote:
Quote:

And? What's the problem with that? The question remains the same: did the Amiga get support from the software houses or not?

Have you realized that September 1986 to your cited 3D software example's 1988 release year has damaged your "easy to program with 68K" argument?

No, because I, unlike you, can contextualise and stay grounded in reality.

What's not clear to you that the FIRST release of Sculpt 3D was in 1987? What's not clear to you that after just ONE YEAR it already got a new version with FPU support?

And we're talking about a 3D ray tracing application: one of the most complicated pieces of software.

BTW, I doubt that it was fully coded in 68k assembly. Very likely it's C code, because of the great complexity of the software.

What would have helped here, A LOT, are the Amiga OS APIs / libraries which allowed the GUI for the software to be written very quickly.
Quote:
Your "easy to program with 68K" argument is useless.

See above. And BTW, at least I know very well how to program a 68k, and x86 as well, whereas you've no clue at all, since you're just searching the web trying to grasp some information.
Quote:

The fact that I've cited the FPU support for 1988 was just what it is (!): a testament from the fact that the Amiga received support for professional software DESPITE THE INSTALL BASE (which had ZERO FPUs from the available machines).

1. The missing factor is Amiga ECS's 640x480p productivity mode being missing in action during the 1987 Windows 2.x+VGA and Macintosh II release window.

Irrelevant: see above, and my previous comments (professional graphics cards were developed for the Amiga, which had a FLOURISHING market from this PoV).
Quote:
FPU alone doesn't complete the GUI desktop platform!

The FPU was NOT required for the Amiga GUI and related applications.

Again, you don't know what you're talking about, because you never coded a single Amiga (or very likely even DOS) application in your life!
Quote:
Unlike the mainstream PCs, the mainstream Amiga doesn't include an FPU socket.

Totally irrelevant. In fact, this did NOT prevent the Amiga from having software which used the FPU.

How was it possible? Simple. It's very well known (!) that Amiga developers used magic wands to create such software. Since the Amiga machines had no FPU socket, there was no other way, right? Right, Hammer-who-talks-of-things-he-has-no-clue-about?
Quote:
1987's Mac II with 256KB VRAM and PC VGA support 640x480p 16 colors for graphics business markets.

Same as above: irrelevant.
Quote:
Mac II's color 640x480p resolution, and color Quick Draw ecosystem set the Mac LC from October 1990) and LC II (from March 1992) sales boom for the Mac platform.

And again completely derailing, moving the time frame of the discussion ahead. The usual Hammer total non-sense...
Quote:
Amiga engineers have demonstrated ECS's 640x480p productivity mode in Q4 1988 with A2000, while management has other ideas e.g. timed exclusive ECS for A3000 (from June 1990).

Amiga engineers? Sure, the ones who were not even able to add TWO miserable bitplane pointers to the chipset, which already had space reserved for that.

The same Amiga engineers who knocked on the LSI team's door to get help on such a productivity (and SHRES) mode, because they were not able to achieve the goal.

The same Amiga engineers who asked the LSI team even to... write the SPECs (PAPER work) for the new chipset. Paper work which required MORE THAN ONE YEAR (while the LSI team developed the first prototype of the C65 in more or less the same time frame. NOT paper work).

Thank you for reminding us what hands the development of the Amiga chipset was in...

Anyway, we're late in time here: the context of this part of the discussion was about the early years of the Amiga. Forgot it? As usual...
Quote:
There is a story in the Commodore - The Final Years book about suppressing the AmigaOS 2.x upgrade for existing Amiga OCS. Management wanted AmigaOS 2.x/ECS to be exclusive for A3000, and Amigans must buy a fat profit margin A3000 for the upgraded experience.

Sure (and irrelevant: out of context). And tell me more about the A3000 case, which was INCOMPATIBLE with the cards produced for the Amiga 2000 (Video Toaster? "Irrelevant", right?): was this a mistake of the management as well?

Or was it a mistake of the geniuses working on the engineering team?
Quote:
Mainstream press criticism against A3000's June 1990 release was the lack of a 256-color display and a 68040 CPU. Mainstream press doesn't give damn about 3rd parties.

Again, out of context, but see above: you have the geniuses to thank for that.
Quote:
Amiga engineers have designed a fully functional 68040-25 with an L2 cache accelerator card with the A3000, and marketing (management) rejected it. The downgrade A3640 card was Commodore's second attempt for the later A3000T/040, and it was approved by management.

When? Drum roll... out of context as well, right?
Quote:
A3000plus with AGA and 68040+L2 cache in 1991 is superior when compared to Commodore management's stonewalling the Amiga while promoting Commodore's PC clone evolution improvements i.e. you want 256 colors from Commodore? Buy a Commodore PC with SVGA instead.

Again, totally out of context. Hopeless...
Quote:
--------------------

Install base's demographics matter for road map planning and reducing development risk for 3rd party developers. Note why Amiga is not a Mac.

See above: irrelevant.
Quote:
For the US market, Amiga OCS's 640x200 NTSC is frozen in time, and A2024's production delays and 5000 unit scale are a joke.

Same as above: irrelevant.

One day maybe you'll write down how it was possible that the Amiga received so much support for professional software, despite the lack of high-resolution + stable graphics.

Mystery of Hammer, who is living in a parallel universe...
Quote:
Quote:

Good. How saved Apple? Microsoft / Gates. As I've already reported. Thanks for confirming it!

You missed Steve Jobs's effort when He argued his case with Bill Gates. MS's support for the Mac wasn't automatic. Apple's superior leadership matters for this case.

What's not clear to you is that this happened only because Bill Gates allowed it? Jobs had to knock at his door to find an agreement, to which Gates agreed.

The superior leadership was Gates's, not Jobs's: it was Gates who allowed it, because he had the power in his hands. Because it was Gates who put Apple in this condition, with the lawsuit which was taking a lot of money and time. Time during which Apple lost a lot of market share and was about to go bankrupt, whereas Microsoft consolidated its (almost) monopoly.

In short, at the time Apple was no longer a threat to Microsoft. The PC had THE market, and Microsoft with Windows and Office as well (Wintel: nomen omen).

So, closing the dispute was just another business opportunity for Gates.
Quote:
Quote:

The engineers of the Amiga team were so much incompetent that weren't able to define even the pure SPECs and LSI team had to jump in and help.

Pay attention the part that I've highlighted: the LSI team was the one which was able to understand the Amiga chipset (and to "transplant" part of it to the C65 chipset) and NOT the (new) Amiga team!

As you can see, you don't miss opportunities to bring fuel to me, and prove my reconstruction.

For the 1987 context, what you miss is CSG LSI's inability to quickly re-engineer OCS Denise with partial AA Lisa features e.g. shared 4096 color palette. Commodore management's firing of Amiga's original graphics engineers has a brain drain cost.

At least the LSI team was able to do something, whereas the Amiga team was not. At all!
Quote:
Henri Rubin initiated the monochrome ECS R&D direction, and the blame for this debacle is on management. Henri Rubin was replaced by Bill Sydnes. Both Bill Sydnes (A300/A600, A2200/A2400/A3200/A3400) and Henri Rubin (A3000) repeated the ECS mistake i.e. departed from ECS's original purpose.

And wasn't it Rubin who wanted the meeting for the new Amiga chipset?
Quote:
After shutting down the original Los Gatos Amiga group, the second Amiga group was from the Commodore's system engineering group (e.g., C900, Unix clone), the original engineer for Paula, and a team from AmigaOS.

Commodore's major engineering groups are;
1. The system engineering group and later created the VLSI group during the AAA project. This group designed C900.
https://en.wikipedia.org/wiki/Commodore_900
Has cost reduced A1000 to A500 e.g., Gary chip, co-designed by an external outsourced team. Primary designers for Gayle, Fat Gary, Bridgette, Buster, Super Buster, Ramsey, DMac, SDMac, custom MMUs, A2620, A2630, A3640, early A3640 with L2 cache, A590 and, etc'.


2. Original Los Gatos Amiga group, designed Amiga ICS (missing 64 color EHB mode) and OCS. The original Los Gatos Amiga group quickly added 64-color EHB mode for PAL A1000 and later NTSC A1000. Cancelled Amiga Ranger has up to 128 color 7 bit planes and is the closest to AGA's 256 color 8 bit planes. Key engineers later designed 3DO (with 16-bit color and 24-bit color display, quadrilateral 3D with texture accelerator) and 3DO M2 (triangle 3D with texture accelerator). For the 1st 3DO, the 3DO team made the same mistake as Sega Saturn and NVIDIA NV1/NV2 quadrilateral 3D systems.


3. LSI engineering group designed VIC-20, C64, C128, C65 and AA Lisa. Participated in AAA Andrea's R&D. Designed CSG 65xx CPUs, CIA, and many other chips.
The LSI engineering group participated in turning the Amiga Lorraine breadboard into three main ASICs. Suffered a brain drain with the SID chip.


4. Commodore PC group (led by Jeff Frank, later moved into the Amiga group's leadership position in June 1991, replacing Jeff Porter).


5. Multimedia group (e.g. CDTV, CDTV-CR), Akiko's DMA CD-ROM controller, and hardware C2P. Dependent on the Amiga group's multimedia chipset R&D.
Jeff Porter was moved into this group in June 1991, and cared enough about the planar with chunky pixels issue.

Read the Commodore - The Final Years book.

Not needed: you've reported enough stuff to understand what the situation was.

At least I'm able to understand. Not your case, of course.

Kronos 
Re: Commodore > Motorola
Posted on 21-Apr-2025 8:46:14
#133 ]
Elite Member
Joined: 8-Mar-2003
Posts: 2749
From: Unknown

@cdimauro

Quote:

cdimauro wrote:
One day maybe you'll write down how it was possible that the Amiga received so much support for professional software, despite the lack of high-resolution + stable graphics.


Amiga did receive "support for professional software" in areas where video compatibility and high color counts were essential; everything else was spotty compared to the PC, Mac or even the ST.

The A2024 or SuperHighRes Denise would only have helped if they had been available at launch, at reasonable prices (having that launch a bit earlier would have helped a lot). SuperHighRes would also have needed a sense pin on the video connector forcing WB into that mode at startup, avoiding the need for a multisync monitor if one wished to use only that mode.

As it was, it was a computer for video and GFX, steering toward game console territory with the A500 release.

_________________
- We don't need good ideas, we haven't run out on bad ones yet
- blame Canada

coder76 
Re: Commodore > Motorola
Posted on 21-Apr-2025 11:39:28
#134 ]
Member
Joined: 20-Mar-2025
Posts: 19
From: Finland

About the Amiga custom chipset, I still think it's the right approach to have this $dff000 address space for everything, including a 3D unit. Not a separate graphics card, like Nvidia's in desktop computers; the 3D chips for mobile phones are also a lot smaller and don't require cooling. Here's e.g. this Maggie 3D unit for the Apollo SAGA:

https://m.youtube.com/watch?v=5rxggxxPzYg

I'm sure that, performance-wise, it's at least a thousand times slower than modern 3D cards, but it is also more compact and minimalistic, which suits the Amiga system better. And you don't need 3D driver software to use the 3D unit: just plug some values into $dff000 and it will work.

IntuitionAmiga 
Re: Commodore > Motorola
Posted on 21-Apr-2025 13:31:50
#135 ]
Regular Member
Joined: 5-Sep-2013
Posts: 130
From: Unknown

https://youtu.be/njGWWg69B4A Nice MVG video about the 68000.

_________________

Lou 
Re: Commodore > Motorola
Posted on 21-Apr-2025 20:38:58
#136 ]
Elite Member
Joined: 2-Nov-2004
Posts: 4258
From: Rhode Island

Quote:

IntuitionAmiga wrote:
https://youtu.be/njGWWg69B4A Nice MVG video about the 68000.

Did you notice how he mentioned Motorola's 680X line being just 1 customer: GM?
I've said this before. But now that someone else has said it, perhaps some numb-skulls on this forum will listen to someone who's actually worked for Motorola... specifically their ISG division.

Did you notice how powerful he said the 68000 is that arcade boards had to use 2 and/or 3 of them?
/facepalm
(while being clocked higher than computers as well...)

As we know, clock for clock a 68000 can't even add 1+1 as fast as a 6502... meanwhile the 6502 also stores the result to RAM... so this test gimps the 6502 and it still wins... if the 68000 were forced to store the value back to RAM it wouldn't even be close... not that it was actually close...
https://www.youtube.com/watch?v=2k_jP73Ly7A
The comments are gold.

Atari Lynx designers, David Needle and R.J. Mical (maybe y'all have heard of them) chose the 6502 instead of 68000 for a couple of reasons.
https://forums.atariage.com/topic/249018-im-fine-with-the-lynxs-cpu-but-why-did-they-pick-a-6502-and-not-a-z80b/#findComment-3438437

Amiga used the 68000 from 1985 -> 1992 (A600). What a load of bull!
Thanks to Motorola, we didn't get A500/2000's with '020 base cpu in 1987.
Thanks to Motorola, we didn't get a CDTV with an '030, and who could afford a $3,379 A3000 ('030) in 1990? $3,699 for an A4000 ('040) in 1992? LOL!

Bill Mensch figured out Motorola sucked in the 70's. Hence the 6502.
Finally - during 'hombre' development they realized Motorola sucked... Cheaper and better options were available.

matthey 
Re: Commodore > Motorola
Posted on 21-Apr-2025 20:53:35
#137 ]
Elite Member
Joined: 14-Mar-2007
Posts: 2624
From: Kansas

IntuitionAmiga Quote:

https://youtu.be/njGWWg69B4A Nice MVG video about the 68000.


Another great video by Modern Vintage Gamer. It is only 8 hours old and already has 54k views, showing just how popular the 68k remains 46 years after the release of the 68000. I am unsure if the 68k Amiga is popular enough for the 68k to be mass produced again, but I am more confident about universal 68k hardware.

The video has a strange and funny 68k coding example.

https://youtu.be/njGWWg69B4A?t=642 Quote:

1 int x = 2;
2 int y = 5;
3 int result = x + y;

compiled assembly language
move.l x,%d1
move.l y,%d0
add.l %d1,%d0
move.l %d0,result


The x and y are constants so a good compiler would generate the following code.

moveq #7,d0
move.l d0,result

Assuming x and y are not constants, as in his RISC-style assembly code, CISC coding would give the following, using one less instruction, 2 fewer bytes of code and only one register instead of two compared to his original example.

move.l x,d0
add.l y,d0
move.l d0,result

He had just talked about the importance of registers and then showed RISC inefficiency. He is an experienced coder, having ported CannonBall (the OutRun engine) and Strife to the Amiga.

https://aminet.net/package/game/race/CannonBall
https://aminet.net/package/game/misc/StrifeV1.2

He may lack 68k assembly coding experience, or perhaps he had to turn off all compiler optimizations to generate 68k assembly that resembles the original C code. Even so, the 68k assembly still does not match the C code: the x and y variables changed from constants using immediates into global variables. RISC programmers may not have noticed the difference, but this is the 68k, which is so easy to program (as he points out in the video) that more than a few 68k fans are likely to notice.

Edit: There is already a comment on the code amongst the many pages of comments.

galier2 Quote:

Small nitpick at 10:45. Your 4 assembly instructions only cover the 3rd line in C. The lines 1 and 2 are not in your example and would be represented by move.l #2,x and move.l #5,y respectively. Another minor issue, most C compilers on m68k of that era had 16 bit wide int type. So, theoretically you should have used .w as size and not .l, or alternatively use type long in the C code.


I suspect this guy is correct. The first two 68k assembly lines, with the immediate moves to x and y, were omitted, and this looks like GCC-generated code with all optimizations turned off. The variables also had to be declared as globals elsewhere. The 68k Amiga often used 32-bit integers, which resulted in slower and larger code on the 68000 but better code on the 68020+. Many other 68k targets defaulted to 16-bit integers.

Last edited by matthey on 21-Apr-2025 at 09:28 PM.

matthey 
Re: Commodore > Motorola
Posted on 21-Apr-2025 22:37:38
#138 ]
Elite Member
Joined: 14-Mar-2007
Posts: 2624
From: Kansas

Lou Quote:

Did you notice how he mentioned Motorola's 680X line being just 1 customer: GM?
I've said this before. But now that someone else has said it, perhaps now some numb-skulls on this forum will listen to someone who's actually worked for Motorola...specifically their ISG division.


I agree that the 6800 was slighted in the video. It was a significant and influential CPU family with technological advances. The 6502, which likely would not exist without the 6800's influence, and later the 68000 stole the show, and the 6800 was often forgotten. Influential does not necessarily mean good, though: the Intel 4004, MOS 6502 and Intel 8087 were very influential chips too, but had ISA-related limitations or handicaps.

Lou Quote:

Did you notice how powerful he said the 68000 is that arcade boards had to use 2 and/or 3 of them?
/facepalm
(while being clocked higher than computers as well...)


The price of the 68000 dropped from roughly $500 to $15. CPUs of similar performance were more expensive, so multiple 68000s were used; SoCs today are usually multicore as well. The 68000 could also be clocked up without requiring expensive memory, which you do not seem to understand: using registers gains performance at a higher clock speed even if memory is no faster.

Lou Quote:

As we know, clock for clock - a 68000 can't even add 1+1 as fast as a 6502...meanwhile it also stores it to ram...so this test gimps the 6502 and it still wins...if the 68000 was forced to store the value back to RAM it wouldn't even be close...not that it was actually close...
https://www.youtube.com/watch?v=2k_jP73Ly7A
The comments are gold.


The 68000 is much more flexible than the 6502. With 16-bit or 32-bit datatypes, the 68000 has better performance; the 6502 performs well only with 8-bit datatypes and small instructions with limited addressing modes, using small amounts of memory. Its non-orthogonal ISA makes that handicap difficult to fix.

Lou Quote:

Atari Lynx designers, David Needle and R.J. Mical (maybe y'all have heard of them) chose the 6502 instead of 68000 for a couple of reasons.
https://forums.atariage.com/topic/249018-im-fine-with-the-lynxs-cpu-but-why-did-they-pick-a-6502-and-not-a-z80b/#findComment-3438437


Still cherry picking wrong information? The 68000 is much easier to code for and has much better code density than the 6502. It is also more memory-bus efficient: fetching instructions generates less memory traffic, and GP register-to-register operations decrease memory data traffic (chipsets/DMA may also use memory, so it is best not to hog the bus, and extra memory accesses increase power and decrease performance). Early 68000 chips were expensive and large, and more active transistors use more power, but the 68k eventually became cheaper and came in smaller packages, allowing it to be used in portable devices like Palm PDAs and TI calculators. The 6502 was used in more portable devices than ARM early too. The 6502 would not be considered today except for retro gaming compatibility.

Lou Quote:

Amiga used the 68000 from 1985 -> 1992 (A600). What a load of bull!
Thanks to Motorola, we didn't get A500/2000's with '020 base cpu in 1987.
Thanks to Motorola, we didn't get a CDTV with an '030 and who could afford an $3,379 A3000(030) in 1990? $3,699 for an A4000(040) in 1992? LOL!


Commodore had nothing to do with it?

Lou Quote:

Bill Mensch figured out Motorola sucked in the 70's. Hence the 6502.
Finally - during 'hombre' development they realized Motorola sucked... Cheaper and better options were available.


The 6502 is an area- and power-optimized minimalist MPU. It is specialized, yet you make it sound like a good general purpose CPU; it was used as one early on only because it was so affordable. The 68k CPUs are infinitely better general purpose CPUs, while the 6502 is only useful for specialized embedded MCU use where the code is tiny. Comparing the 6502 to a 68k CPU is like comparing a motor scooter to a general purpose car: the scooter is very efficient in some ways, but even a trip to pick up groceries shows its limitations.

Last edited by matthey on 22-Apr-2025 at 12:32 AM.

agami 
Re: Commodore > Motorola
Posted on 22-Apr-2025 1:21:17
#139 ]
Super Member
Joined: 30-Jun-2008
Posts: 1929
From: Melbourne, Australia

@thread

MVG just posted a nice retrospective on the 68000

68000 - The CPU ahead of its time (YouTube)


Last edited by agami on 22-Apr-2025 at 01:21 AM.

_________________
All the way, with 68k

matthey 
Re: Commodore > Motorola
Posted on 22-Apr-2025 2:38:39
#140 ]
Elite Member
Joined: 14-Mar-2007
Posts: 2624
From: Kansas

agami Quote:

MVG just posted a nice retrospective on the 68000

68000 - The CPU ahead of its time (YouTube)


IntuitionAmiga posted a link to the same video above in post #135. It is hot.

54k views in 8 hours
98k views in 14 hours

Over 400 comments of mostly positive praise for the 68k too. Lou thinks the 68k is crap though. Motorola threw their beautiful 68k baby away for PPC and Trevor also traded the 68k for PPC with predictable failure. The 68k is the best CPU for education as the assembly code is like a high level language.

rokker333 Quote:

I read 68000. I instantly click. It was the CPU I learned assembly on (Amiga) when I was 16. Such an incredible CPU for the time. So convenient to program on. I tried my first assembly steps on the 6502 and the 68k was like programming in a high level language.


ScottLahteine Quote:

It was truly a nice processor. I went from 6502 coding on the Atari 8-bit to 68000 coding on the Amiga and it was like moving from a shack to a mansion, with its convenient 8 data and 8 address registers. I wrote two complete games on that processor using only DevPac and Deluxe Paint, and maybe someday (after I disassemble those old games) I’ll write more!


markusjohansson6245 Quote:

Im an Amiga kid that spent years programming on the 68k and 68020, demos and such. It was such a fun cpu to program on. As said, the design, instruction set etc just made sense and worked as expected. When the Amiga fell off due to all the reason and I moved on to the pc I tried picking up x86 assembly but it was just so horrible to program on. The dude who designed it probably was on drugs.
Anyway, I have the amiga and 68k party to thank for my currently soon 30 year long career as a developer.


DmitryPuffin Quote:

I learned a bit of assembler on both x86 and 68k while learning at uni. 68k felt more easier to code, for sure.
Also, worth mentioning that it was powering some legendary synthesizers like the Fairlight CMI 3 and Waldorf Wave.
Truly remarkable piece of computing history.


przemekkobel4874 Quote:

IIRC, for 68k term 'orthogonal design' meant that the CPU could use any data-related instruction on any data register. Addressing modes were limited, depending on instruction itself (so not orthogonal like on some mainframes), but again if there was an address register involved, the instruction could use any of those as well. This was way more flexible than x86. Interestingly enough, tiny 6502 had almost as many addressing modes as 68k (13 vs 14).


Rybagz Quote:

I got into it in 1988 - 68K assembler was like a wish-list for a 6502 programmer (logical since it was successor to 6800 and 6502 a copy of that). It was way better than x86 until the 386 came along. Apple moving away from them around the time Atari and Commodore were dying pretty well meant the end. I have 68000 to thank for me quickly being able to pick up IBM XA/370 Assembly language on mainframes a couple of years later - some of the concepts like base register addressing being similar. I still have Amiga and ST machines though not the time to use them.


leftymuller Quote:

Looking back on this.. the 68k arch and Amiga should of won the PC war !!


f.berger7756 Quote:

I loved this CPU compared to x86 light-years ahead


rztrzt Quote:

Loved the 680x0 processors, assembly on them was a joy.


spiritualastralsoul Quote:

I always hated using Windows 3.1 at college because in comparison to my Amiga A500 it just felt shit, but unfortunately we had to use M$ Office. I know we had Wordworth and could use 720k PC Floppies on the Amiga via CrossDOS, but it was inconvenient. Not only that, but I guess this is why there are so many Mega Drive/Genesis games still being made. The love for the M86k will live on forever. Great video.


dltmap Quote:

Such a love letter to the 68000! Thanks


aquagoose04 Quote:

The 68000 is honestly probably the greatest CPU of all time. Would love to teach myself 68k assembly at some point.


AG-bp3ll Quote:

68000 is a legend. Love seeing a video about it.


StevenJennings-x8w Quote:

Love this video… long love the 68k processor!!! On a side note, this is probably the first video you’ve sounded so “down in the dumps” - hope you’re ok mate?


craigprocter1232 Quote:

I did z80, 6502 and 68000 asm coding and yeah, loved the 68K (demos and trainers mostly) - z80 coding did land me a job programming commercially for the GBC in the early 00s.


JohnnyReb1976 Quote:

The goat. I loved it as a programmer; it was a 16-bit CPU, but you could just ignore that and write 32-bit code. Once true 32-bit CPUs in the 68k series came about the code would just run twice as fast. Brilliant forward thinking on the part of Motorola.


sonic2000gr Quote:

My third and most useful home computer was the 1040 STE. I loved 68K assembly programming. This CPU is a beast.


jcthe2nd Quote:

Love this CPU Amiga forever


IntuitionAmiga Quote:

I’ve recently written a 68020 emulator in Golang for my multi-cpu-arch VM. Such a beautiful architecture which i fell in love with on my A500 in 1989. Some videos of it running test suites on my channel for anyone interested.


The_Conspiracy_Analyst Quote:

Bring back 68k!!!


So much love for the 68k from the programmers! Then there is Lou; the Motorola CEO who threw the 68k away for PPC; and Trevor, who thought he could throw the 68k away for PPC too. PPC AmigaNOne and Amigaworld.net are dead, but the 68k lives on elsewhere. I wish we could take advantage of this 68k love and momentum and bring the 68k back for real instead of ignoring the 68k market.

Last edited by matthey on 22-Apr-2025 at 02:40 AM.
Last edited by matthey on 22-Apr-2025 at 02:39 AM.

Goto page ( Previous Page 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 Next Page )

[ home ][ about us ][ privacy ] [ forums ][ classifieds ] [ links ][ news archive ] [ link to us ][ user account ]
Copyright (C) 2000 - 2019 Amigaworld.net.
Amigaworld.net was originally founded by David Doyle