INTC - Intel


FrankArabia


Though to be fair, many people don't need more processing power.  So we could arguably be at the point where not every consumer needs more processing power.

 

This is a good point. The consumer of today worries more about things like efficiency and battery life. While faster processing is a factor in Intel's growth, it's probably not the main concern. I would worry more about desktop/laptop sales. Windows 8 could be what pushes growth again soon, because everyone is just waiting until the release.



Guest rimm_never_sleeps

It used to be that the #1 recommended upgrade was a new processor. Now it's an SSD. That's a big shift. I upgraded to an SSD. I just don't need a new processor for what I do, and neither do most people.


  • 3 weeks later...

I'm a n00b, so forgive me. But isn't the idea of server virtualization that you will no longer need large, expensive servers but rather can make do with cheaper, commodity servers? If so, doesn't that put a dent into the idea that increasing computing demand for server infrastructure will drive Intel's growth?

 

 

 


The main benefit of server virtualization is that it can lower payroll costs.

http://4sysops.com/archives/does-server-virtualization-reduce-costs-part-iii-software-and-payroll-costs/

The other articles on that site about server virtualization are good.

 

Hardware costs are about the same if not greater with virtualization.  There's a reason why hardware manufacturers push it.

 

One of the trends happening now is that many desktops in corporate environments are being replaced by thin clients and virtualized desktops running on a server. Hardware costs actually go up. So that is a trend that is decreasing desktop sales and increasing server sales.

 

The media talks about tablet sales cannibalizing desktop (and laptop?) sales, but I'm not entirely sure that's the case.


Guest valueInv

I'm a n00b, so forgive me. But isn't the idea of server virtualization that you will no longer need large, expensive servers but rather can make do with cheaper, commodity servers? If so, doesn't that put a dent into the idea that increasing computing demand for server infrastructure will drive Intel's growth?

Not necessarily. It allows you to increase server utilization. For example, in the past, if you had an exchange server using 40% capacity and a web server using 30% capacity, you needed two different machines. Now with virtualization, you can run them on one machine and use the other machine for something else.
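The consolidation arithmetic above can be sketched in code. A minimal illustration with the hypothetical utilization numbers from the example; the simple first-fit packing policy is chosen purely for the sketch, not any particular hypervisor's algorithm:

```python
# Hypothetical sketch: pack VM workloads onto as few physical hosts as
# possible without exceeding a utilization cap. First-fit is used here
# purely for illustration; real capacity planners are more sophisticated.

def consolidate(workloads, cap=0.8):
    """Greedy first-fit: place each workload on the first host with room,
    never letting a host exceed `cap` of its capacity."""
    hosts = []
    for load in sorted(workloads, reverse=True):
        for host in hosts:
            if sum(host) + load <= cap:
                host.append(load)
                break
        else:
            hosts.append([load])
    return hosts

# The Exchange (40%) and web (30%) servers from the example share one box:
print(len(consolidate([0.40, 0.30])))  # 1 physical machine instead of 2
```

Note that the consolidated host still needs roughly the same aggregate CPU capacity, which fits the observation that virtualization raises utilization rather than shrinking total hardware demand.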

 

Virtualization is not new and has already been widely adopted. For whatever reason, it has not put a dent in server demand.

 

However, there are other factors that could affect Intel. If the popularity of wimpy cores increases, it will increase demand for ARM processors or low-margin Atom processors.


I see the end game being a phone in your pocket that can be docked to a couple of monitors, a mouse, and a keyboard. I think super-mobile computing will follow the same trajectory as the desktop computer. There will be a speed race while keeping power demand in check. The question is which company (I'll group together the ARM companies: NVDA, Samsung, Apple, etc.) will produce the fastest processor with an acceptable power demand. Intel has a clear roadmap to 14nm, with plans to shrink from 14nm to 7nm and then 5nm. I like Intel because they already turned the battleship several years ago, heading toward greater power efficiency. They don't really have to do anything special now aside from business as usual: supplying the lucrative (and growing) server market, providing chips for the slowly decaying PC and laptop market, and keeping R&D at the same level they have for years. In 2014 Intel chips will start hitting the mainstream phone market, and their further R&D under their tick-tock development scheme should secure their future.

 

There is a lot of talk about margin compression, but the cost of the CPU is quite small compared to the cost of a smartphone, and I think consumers would be willing to pay up for a significantly better processor, as they have in the PC world for years. Smartphones have the added benefit of a shorter life cycle than PCs. I've read a lot on here about HPQ and DELL. It seems a lot of people are buying into them because they are cheap and a turnaround seems very plausible. With INTC you can buy into the same "death of the PC" hype but get a best-of-breed company with a clear roadmap to future revenue streams.

 

I see this in a similar way to how I viewed the BP disaster. Many investors jumped into BP because it had fallen the most and they believed it would recover. They spent a lot of time estimating the cost of the disaster and verifying that BP would be able to cover it without hurting the future of the company. Noble was the best-in-breed driller and could have been bought at a 25% discount due to the BP disaster. 2.5 years later, the investors who took the risk and bought BP and those who chose the best-of-breed NBL are both up ~50%. Which group of investors took the lower-risk bet?


TSMC is publicly traded and is one of the biggest fab companies out there.  Here is an interesting interview with its CEO:

 

http://focustaiwan.tw/ShowNews/WebNews_Detail.aspx?Type=aECO&ID=201201020022

 

"All of our customers rely on TSMC in foundry production, and Intel relies on its own foundry plants," he said. "If our technologies are not improved enough and Intel keeps improving its technologies, our customers' products will lose competitiveness to those of Intel. It's horrible to imagine the outcome."

If you are really interested in semiconductor (chip) technologies, check out the conference proceedings of IEDM and the VLSI Technology Symposium. More or less all the major chip technology companies publish their "technology platform" papers in these two major conferences.

 

You can compare device performance, metal layer densities, the number of components offered, and so forth. If you have contacts in circuit design circles, you should be able to judge the status and usefulness of these "chip technologies" from IDMs and foundries.

 

http://www.his.com/~iedm/

http://www.vlsisymposium.org/

 


-Intel actually has the advantage in making low power CPUs since they have cutting-edge fabs.  Their lead in process size and process technology is kind of unfair.

 

There are different markets for "low power" CPUs... servers, notebooks, smartphones, tablets, etc.

 

-The smartphone and tablet ecosystem was historically centered mostly around ARM. So switching instruction sets is a minor barrier. Some games run a lot slower on Intel's smartphones than on equivalent ARM-based phones because instruction set emulation is slow.

 

If you look at history, Apple was able to switch from the Power instruction set to x86. Some customers had to wait a few months for x86-compatible software to be released. Minor technical differences between the two instruction sets (e.g. endianness) mean that it takes a small effort to fix bugs; however, it takes some companies a few months to test and release software because pushing out extremely frequent updates is too burdensome (e.g. too many versions out there can cause support issues; for some niches, users go out of their way to avoid the latest version because they don't want to get burned by new bugs). The current situation with Apple is that Power support is being dropped completely now that the transition is over.
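As an aside, the endianness difference is easy to demonstrate. A small sketch using Python's `struct` module for illustration; the two byte orders correspond to the PowerPC and x86 conventions:

```python
# Illustrative: the same 32-bit integer serialized under the two byte
# orders. Code that writes raw bytes under one convention and reads them
# back under the other gets a scrambled value -- the class of porting bug
# referred to above.
import struct

value = 0x01020304
big    = struct.pack(">I", value)   # big-endian (PowerPC convention)
little = struct.pack("<I", value)   # little-endian (x86 convention)

print(big.hex())     # 01020304
print(little.hex())  # 04030201

# Misinterpreting little-endian bytes as big-endian yields the wrong number:
wrong = struct.unpack(">I", little)[0]
print(hex(wrong))    # 0x4030201
```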

In mainframes, there is entrenchment in whatever the customer's current solution is.

 

Is there a network effect (VHS/Beta) that could lead to this outcome?

I think that the strength of a network effect depends on how difficult it is to switch and how difficult it is to accommodate two competing solutions.

 

-In software, it is very easy to switch instruction sets.  It takes very little programming effort to re-compile your program for another instruction set, fix bugs, and ensure decent performance.

-It is hard to make a non-Windows version of your software. Doing a good multi-platform UI takes additional effort... usually vendors make one UI for Windows and another for the Mac.

-It is hard to make a non-Windows version of your cutting-edge computer game.  e.g. Macs have a very limited selection of games for this reason.


-The smartphone and tablet ecosystem was historically centered mostly around ARM. So switching instruction sets is a minor barrier.

 

I think you are underestimating how big a deal it is to switch from ARM to x86 on a mobile platform.

 

Apple might be able to pull off such a transition again because they have an ecosystem that is a lot more integrated than others, but I can't imagine Android phone makers easily switching to x86, and I can't imagine an already fragmented ecosystem wanting to support both ARM and x86 and making sure that everything is compiled twice or runs on emulators.


Check out reviews for the Intel-based smartphones out there.  Xolo is one of them.  The phones are here now and few people will notice that it is an Intel chip inside.

 

http://semiaccurate.com/2012/04/26/how-well-does-intels-new-phone-work-as-a-phone/

Other reviewers may like the Intel smartphone a little more.  More so if the reviewer is looking at synthetic tests that have little to do with real world performance.

 

Look... you could actually do some research.

Look at the past.

Look at the present (e.g. Intel smartphones in the hands of reviewers).

Learn something about software development.


Apple might be able to pull off such a transition again because they have an ecosystem that is a lot more integrated than others,

 

Apple's preference right now is to keep SoC innovation in house for tablets and phones, correct?  Probably a lot easier to work this way when you control the OS.  Wouldn't Google want to bring SoC development in house also?


Check out reviews for the Intel-based smartphones out there.  Xolo is one of them.  The phones are here now and few people will notice that it is an Intel chip inside.

 

http://semiaccurate.com/2012/04/26/how-well-does-intels-new-phone-work-as-a-phone/

Other reviewers may like the Intel smartphone a little more.  More so if the reviewer is looking at synthetic tests that have little to do with real world performance.

 

Look... you could actually do some research.

Look at the past.

Look at the present (e.g. Intel smartphones in the hands of reviewers).

Learn something about software development.

 

I was making a general statement about switching from one CPU architecture to another.

 

In the past, emulators have always lost to native implementations. It's possible to transition from one to the other, but it must be worth the switch (in Apple's case it was very much worth it).

 

We'll see how it turns out, but emulating ARM on an x86 chip certainly isn't a very elegant solution... This is the first time I've ever heard of the Xolo; I guess it's not very popular.


Yes emulation is really slow.  In the future it will disappear.  Developers can create a version for each CPU architecture.  It's a minor inconvenience (the real cost is in testing your program on the wide range of Android devices out there, some of which have GPU bugs).  The Android store will handle the rest and it should be really easy on users:

http://www.embedded.com/design/system-integration/4397060/Getting-rid-of-Androids-fat-APKs-

 

This should be slightly more elegant than Apple's fat-binary interim solution when it transitioned from PowerPC to Intel. (Or you could use fat binaries.)

 

- Currently most Android apps don't need emulation on Intel phones since they are programmed with Java.
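The point about Java generalizes: managed-runtime code compiles to architecture-neutral bytecode, so only the VM has to be ported. A rough analogy in Python, with CPython bytecode standing in for Dalvik bytecode purely as an illustration:

```python
# Illustrative: source compiles to architecture-neutral bytecode, so the
# same program runs unchanged on ARM or x86; only the virtual machine
# itself is architecture-specific.
import platform

code = compile("total = sum(range(10))", "<app>", "exec")
ns = {}
exec(code, ns)

print(ns["total"])         # 45, regardless of the host CPU architecture
print(platform.machine())  # only the interpreter knows which ISA it runs on
```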


Guest rimm_never_sleeps

-The smartphone and tablet ecosystem was historically centered mostly around ARM. So switching instruction sets is a minor barrier.

 

I think you are underestimating how big a deal it is to switch from ARM to x86 on a mobile platform.

 

Apple might be able to pull off such a transition again because they have an ecosystem that is a lot more integrated than others, but I can't imagine Android phone makers easily switching to x86, and I can't imagine an already fragmented ecosystem wanting to support both ARM and x86 and making sure that everything is compiled twice or runs on emulators.

 

I doubt Apple will ever switch. For one, they control their SoC designs and "own" them. They never owned the PowerPC. The economics pointed to a switch to Intel on the compute side. However, the economics do not point to Intel on the mobile side. Samsung has no incentive to switch either. Moto has more incentive, as they are owned by GOOG and it makes sense for GOOG to support an alternative architecture. It may make sense for Nokia and LG. However, Intel has not been blessed by MSFT for inclusion in its phones. Intel is now learning what it feels like to be on the outside looking in (on a major wave).


It's about where the puck is headed right?  What will the smartphone market look like 4 years from now?  I'm going to bet on Intel and its cutting-edge fabs.

 

There have been some companies that have survived Intel's onslaught.  Xilinx and Altera have software development tools for their FPGAs that are hard to duplicate.  The RISC CPU manufacturers had a technology edge... until they didn't.  I don't see what protects the ARM-based smartphone SoC guys.

 

Maybe somebody does a really great job at designing the SoC.  Apple is very sophisticated at designing its own SoCs.  Maybe that will protect them.

All the SoCs out there have specialized portions for video encoding and decoding and for doing image processing for the camera. And most of them license GPU designs from PowerVR. I don't think this area gives anybody an advantage, since they can all license the best design. In the future, the smartphone CPU might become powerful and power-efficient enough that the specialized ASICs on the SoC are no longer needed. Or maybe somebody (e.g. Nvidia) comes out with a really great heterogeneous CPU+GPU to handle what the ASICs do.


Yes emulation is really slow.  In the future it will disappear.  Developers can create a version for each CPU architecture.

I don't think that is the future.

Currently most Android apps don't need emulation on Intel phones since they are programmed with Java.

I think this is the future. Processing power is becoming a commodity, and the trend for years has been to shift away from a focus on efficient programming toward ease of development. One of the strong points of Android (and, I believe, one of the reasons for its success) is that it is relatively easy to write platform-agnostic applications, i.e. applications that can run on a phone, tablet, PC, or television without the developer having to worry about device limitations or interface quirks.

 

From the official android website:

Android also gives you tools for creating apps that look great and take advantage of the hardware capabilities available on each device. It automatically adapts your UI to look its best on each device, while giving you as much control as you want over your UI on different device types. (...) For example, you can create a single app binary that's optimized for both phone and tablet form factors.

Writing and compiling software for specific CPUs is, for most developers, a thing of the past. Servers are virtualized, websites run on any platform, and the same thing is happening for phones and tablets.

 

The moat of Intel has, as far as my analysis goes, nothing to do with their instruction set. Everybody can copy it. Their moat is that it takes immense amounts of capital and knowledge to fit as many transistors as possible on a very small chip and that they are incredibly good at that.

 

Historically their focus has always been on performance for desktops / servers, where power was not really an issue. But due to the rise of mobile computing performance is perhaps not that relevant anymore. Power consumption is increasingly important and it will take Intel some time to adjust to that (at least that's what I tell myself ;) ).

