Sounds more like the era of shitty Intel motherboards is coming to an end.
Ha!
I don’t buy it. The market might contract somewhat, but it will be there for quite some time to come… and someone will come up with a new modular system type that allows plugging in the latest chips, or plugging blocks together. If nothing else, there are lots of fab plants out there now building things to spec. You would be amazed how powerful a single-board computer you can get, and how few people it takes to get your own built. A group of hobbyists could do it easily, or a small company. If ATX goes away, it will just open up the market for cheap modularity to something entirely new.
The future is a PC just large enough for its connectors. Prices should come down rapidly.
While that Z3RO looks pretty cool, nobody who games will want it. Nor will anyone who needs more horsepower, like developers.
Nothing prevents somebody from creating a hot game machine with the same form factor (and for dual use it would make a great pocket warmer on cold nights).
I don’t buy it either. Yeah, so the existing huge motherboard companies will continue to make motherboards. So? And more of the machines being sold are not upgradable. All that tells me is that people who want any degree of expandability or customization will be more likely to custom-build. So, really, more of a divided PC market perhaps, where people who don’t want that still won’t build, but not a lot more than that. What you might see is more of a learning curve for people starting to homebuild, because they won’t be able to start by just upgrading video cards and such.
I doubt it will ever end. It’s too lucrative a business; the margins on parts for “enthusiast” PCs are crazy, Apple-crazy. It will likely become much less common, but I doubt it’ll go away. And eventually manufacturing will become far more accessible, so you’ll just get people downloading motherboard designs and printing off a couple from some cheap automated manufacturing service.
I’ve been building my own “desktop” PCs for 15 years, but I’ve never used an Intel motherboard and won’t miss them. Gigabyte and Asus are where it’s at.
The only Intel motherboards I’ve bought in recent years are Atom boards, where the CPU is soldered to the motherboard; even then, one of my Atom systems uses a Zotac motherboard with an Nvidia chipset, but it cost twice as much as the Intel ones.
Any opinion on MSI? I’ve been eyeing this for a while.
http://us.msi.com/product/mb/Big-Bang-XPower.html
I can’t speak to all their products, but I had an MSI GeForce 9500 that was pretty nice. Also, my current motherboard (sporting a Trinity A8-5600K) is an MSI and it’s pretty nice.
As the commenter Mark Thompson on that page said, Intel really just outsources to a different motherboard company anyways and then just puts an Intel sticker on it.
When Gigabyte and Asus go away or stop making motherboards I will be more worried.
I’ve been building computers all along – my first while I was in college: the COSMAC Elf from the 1976 Popular Electronics article.
Perhaps I’m a dinosaur, but I’ve never liked the concept of the CPU AND sound AND video AND game port all on one motherboard. Every computer I put together always had a video card – and sometimes a sound card – plugged into a motherboard slot. It was really nice that the system was organized so that the onboard video was cut out if you added a video card. But I guess I dislike paying for copper traces and video chips I never use.
So I guess you could say that the old S-100 bus concept was my notion of ideal – the motherboard was just a bunch of wires and you had full flexibility to plug in whatever was out there.
But I understand that there are market considerations, and a motherboard with all that stuff on it gains economies of scale, etc. And I’m not saying the S-100 was the best way from an electrical design/speed standpoint.
I’m close to building another as my present machine is ancient by computer standards (got it in 2006). And as I’m doing a bunch of video editing I need something with more capacity.
The Elf… programmed in hexadecimal as god intended.
Actually I built them (3 versions) with toggle switches and LED I/O (not 7-segment LEDs). But I did add a hex keypad later on, along with the Elf video.
It was a dead-end machine, but I learned a great deal that has actually helped to this day. A month ago I had to build an interface to a National Instruments 6723 D-to-A/A-to-D card to drive ten LEDs (again, not 7-segment) that are posing as stars, then program the NI DAQ to simulate an occultation (a rock passing in front of the star). Same sort of circuit to drive the LEDs.
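For the curious, here’s roughly what that kind of occultation simulation looks like driven from Python with NI’s nidaqmx bindings. A hedged sketch only – the real interface may well have been LabVIEW or C, and the device name, channel, and voltage range below are assumptions, not the actual setup:

    # Sketch: dim one "star" LED to fake an occultation, via NI-DAQmx.
    # "Dev1/ao0" and the 0-5 V drive range are assumptions.
    # Requires: pip install nidaqmx (plus the NI-DAQmx driver)
    import time
    import nidaqmx

    BRIGHT_V = 4.0      # star at normal brightness
    OCCULTED_V = 0.5    # residual light while the "rock" passes
    DIP_S = 0.8         # how long the occultation lasts

    with nidaqmx.Task() as task:
        task.ao_channels.add_ao_voltage_chan("Dev1/ao0", min_val=0.0, max_val=5.0)
        task.write(BRIGHT_V)       # star shining steadily
        time.sleep(5.0)
        task.write(OCCULTED_V)     # rock crosses in front of the star
        time.sleep(DIP_S)
        task.write(BRIGHT_V)       # star reappears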
Yes, I remember building them, until I realized one day that the parts were more expensive than an off-the-shelf unit. But it was fun.
Actually this shouldn’t be surprising, as the era of the home PC is probably coming to an end. The last desktop I bought was 7 years ago, and with tablets replacing laptops, demand for desktops is declining the same way mainframe demand did, with the only remaining demand coming from commercial users and hobbyists. Such is the nature of progress.
Last I looked, PC sales were pretty much flat over the last couple of years and substantially higher than they were ten years ago. Which is to be expected in a mature market. I recently built a new desktop gaming PC for the first time since 2005, but hardware is so cheap now that in the meantime I’ve built two servers and an HTPC.
As for mainframes, aren’t there more operating now than there were in their supposed heyday? They just aren’t sexy any more. Heck, you could argue that the whole tablet/phone/’cloud’ thing is a reversion to the old mainframe era with relatively dumb devices talking to powerful servers.
Edward,
Exactly, they just quietly dropped into the background and were forgotten by all but those who have to or wish to work with them. The same will be true of desktops.
I gave up building my own machines when I realized that, with having to buy my own OS and office suites (no, I was not going to make my spouse use a Linux box for production work), I was paying about $300 more to build my own machine than to buy one from Gateway or Dell or whomever. It no longer made sense.
Then I realized that all I ever really needed to upgrade on my Windows boxes was RAM. Then I had the disk crash from hell which, despite all of my careful backups, proved impossible to fully recover from thanks to Windows architecture (aka the Registry), because I had neglected to purchase software to do image backups. About six months later I bought an iMac and have never looked back.
It was fun to build those machines, but now I spend those DIY energies on trying out new pickups on my guitar (soldering irons FTW) and home recording. And my computer Just Works (for me, not interested in a Windows vs Linux vs Mac flamewar).
Norton Ghost is your friend.
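(For anyone who’d rather not buy imaging software: the crude core of what Ghost does – a raw, whole-disk image rather than file-level copies – fits in a few lines of Python. A sketch only, assuming a Linux box where the disk in question shows up as /dev/sdb; real tools also skip free space, verify, and compress:

    # Crude whole-disk image: every byte of the disk, Registry and all.
    # Assumes Linux, run as root, and that /dev/sdb really is the right
    # disk -- double-check, since a wrong name images the wrong drive.
    import shutil

    with open("/dev/sdb", "rb") as disk, open("backup.img", "wb") as image:
        shutil.copyfileobj(disk, image, length=4 * 1024 * 1024)  # 4 MiB chunks

Restoring is the same copy in the other direction, which is exactly why image backups sidestep the Registry problem.)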
I’m still building my systems for about 30% less than what a comparable system would cost from a boxmaker. You can find cheap boxes, but these are low end systems thrown together from cheap components of the day. If you’re only running business software, it matters not. If you’re a high-end user (3D modeling, gaming), choosing the right components at the right price makes all the difference in the world.
I only bought a couple of Intel mainboards: one when the first P4 1.8 GHz processor came out, and another some time after, but that one lasted 3 days and then started blue-screening. Mostly Gigabyte and Asus, plus a couple of SuperMicros. I’ve honestly never found building your own system to be any cheaper than just getting a cookie-cutter prefab machine, but overall I think you get better reliability out of a DIY machine – not only because of the modularity, but also because the quality of the components on the motherboard and expansion cards themselves is generally better. But I really haven’t built a computer of my own in about 4-5 years. I just can’t keep up with the rat race of new component cycles every 6-8 months; by the time you pull the trigger on your setup and put it together, another “MUST HAVE” processor comes out that of course requires a new socket on the mainboard. So my last upgrade was a Toshiba laptop: it weighs less than 3 lbs, runs circles around my last home-built PC with its quad-core i5 560, and Intel has finally made an integrated graphics processor that can handle decent gaming graphics – not to mention full 1920×1080 HD resolution out the HDMI port.
I could certainly see things heading toward tablet PCs taking over desktop workstations. With near-field communication you could have a docking pedestal: just lay the tablet on it and it wirelessly hooks up to your monitors, peripherals, and network, and charges the battery too. I know my little Nexus 7 tablet, with its quad-core Nvidia chip, shows some serious power when it comes to 3D graphics and number crunching. And with a Bluetooth keyboard, a mouse, and a wireless printer, it turns into a workstation that can fit in the front pocket of my blue jeans.
It used to be that desktop PCs were needed because of their greater performance over laptops, whose emphasis was on portability over productivity. But that line has blurred a great deal in the last few years with this new generation of processors that can crunch billions of instructions a second without needing a heatsink and fan. And SSDs have brought mass storage down to a portable size – hell, they just came out with a 1-terabyte thumb drive, for crying out loud. It won’t be long before desktop PCs become some quaint relic that people cling to as a kitschy novelty.
Desktops, laptops, and tablets are not going to disappear. They are complementary technologies. What we will see is closer integration between them. It isn’t an either/or question. People will own all three.
The current group of computer manufacturers can stop making upgradeable systems and other companies will rise up to fill that need.
Well, yeah – just as there are people who still build ’30s coupes, you will still have enthusiasts and high-end gamers going for every last ounce of performance and customization. And I can see labs needing the expansion ports to hook into test equipment. But the fact remains that sales of desktop computers seem to be dropping relative to the number of tablets and smartphones being sold. People still have a desktop computer, but they find themselves logging into it less and less and seeing less reason to upgrade as often, or at all. Or that’s at least my personal take on it. And since you can turn a tablet into a laptop with a few extra accessories, the laptop is becoming less relevant as well.
The problem I see with no more DIY desktops is that you will be limited to the few configurations the manufacturers decide to build. I can put together many different kinds of desktop PCs, each optimized for a different focus.
But there are only a very few kinds of Macs.
Meanwhile, sales of Arduino, Raspberry Pi, IOIO and other small dev boards are exploding.
That’s because they fill a niche that has existed for some years. Long ago you could buy parallel I/O cards and D-to-A/A-to-D cards for your home computer and then start to control things. But they all went away. Homebrew control has been pretty quiescent for years.
But with things like the Arduino, homebrew data acquisition and control capability has resurfaced. And there are a lot of people who want to use that ability.
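To put a number on how low that barrier is now: the logging half of a homebrew DAQ rig is about a dozen lines of Python with pyserial. A sketch under assumptions – that the Arduino side just does Serial.println(analogRead(A0)) in its loop(), and that it shows up as /dev/ttyUSB0:

    # Log analog readings an Arduino prints over USB serial.
    # Port name and baud rate are assumptions for your own setup.
    import serial  # pip install pyserial

    PORT = "/dev/ttyUSB0"  # typically "COM3" or similar on Windows
    BAUD = 9600

    with serial.Serial(PORT, BAUD, timeout=2) as ser, open("readings.csv", "w") as log:
        for _ in range(100):                 # grab 100 samples
            raw = ser.readline().decode("ascii", "ignore").strip()
            if raw.isdigit():                # skip partial or garbled lines
                log.write(raw + "\n")
                print("ADC:", raw)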
I have to admit I’m really thinking about building a DIY robot lawn mower. I have a sliding glass door with a clear view of the backyard. I could sit in the air conditioning mowing the grass and never break a sweat. That just sounds too good to be true.
Silly people keep suggesting that speech will replace keyboards… ever since Gary Seven on Star Trek (Teri Garr being unforgettable).
That’s not so silly at all, Ken. Talk to Siri.
One of the projects I’m working on right now: I took a pair of 3D glasses from the last movie I saw, popped out the lenses, and acquired two old cell phones. I’m going to install the displays and probably some small Fresnel lenses there, and put cams at the corners of the frames, looking forward. Kinect software will use those cams to look for the fingers, and Siri will listen for commands. The user gets a full 3D video and audio experience in one wearable device, controlled by waving one’s hands and speaking. This is all using cheap hardware and existing software libraries. A. C. Clarke was right: indistinguishable from magic.
Siri and other voice-recognition capabilities aren’t quite there yet, though they will be in time.
Harder than recognizing the spoken word (with training, that’s pretty easy) is grokking the meaning. Computers – home computers and tablets, anyway – are still a ways away from hearing the following scenario:
“I went to a restaurant, ordered steak. Paid my bill and left.”
..and being able to answer the question:
“What did I eat?”
I dunno – my 70-something dad just started texting me out of the blue one day. He said he’d just gotten an iPhone. I told him he was typing on the touch screen pretty well. He said, “No, I’m just telling Siri to text you and she does it all.” I was like, well, she’s doing pretty well then.
But Siri doesn’t understand the text message he’s sending to you.
Some people might call that a good thing. Google, on the other hand, is like the operator back in the hand-cranked phone era: listening in on your tasty tidbits of gossip and spreading the info loosely around town.
No.
If anything, this is an era of increased choice in PC form factor; in the Old Days (the VLB/PCI and later era, 1990 or so on) it was ATX or nothing, or maybe mini-ATX.
Now there are plenty of reasonable mini-ITX choices, even for a Real PC, and likewise a number of even tinier form factors for specialized uses.
As long as people need serious power (and both gamers and developers, as well as some other technical users, will continue to), there will be modular PCs.
(That said, the complaint in the article that “you can’t upgrade the CPU” in some of those is, I think, a red herring.
By the time you need a new CPU – if you purchased wisely in the first place – you also need a faster motherboard bus, faster RAM, and everything else, or the new CPU will be bottlenecked.
In all my years of building PCs, I’ve upgraded a CPU in one exactly once – and even then I did it only because I got a great deal on a new CPU and wasn’t quite ready for a new machine.
Furthermore, between ever-faster external buses – like Thunderbolt, with two 10-gigabit PCIe channels in one – and more and more functionality simply built into every computer, internal expandability is really a niche market, mainly for gamers.)