Science & Technology
16 Gigabyte Chip Now in Production
2007-05-03


Samsung Electronics has begun mass production of a 16 Gbyte NAND flash memory chip that will be used in digital music players, music phones, and digital cameras. Samsung, which is making the chips using a 51-nanometer manufacturing process, said it is the first to mass-produce what it claims is the highest-capacity memory chip now available. "In rolling out the densest NAND flash in the world, we are throwing open the gates to a much wider playing field for flash-driven consumer electronics," Jim Elliott, director of flash marketing for Samsung Semiconductor, said in a statement issued Sunday.

Samsung said its 51-nanometer production process can make NAND flash chips 60% more efficiently than the typical 60-nanometer process used in the industry. In addition, the new production process accelerates the read-write speeds by about 80% over current data processing speeds for comparable chip designs. Samsung plans to integrate the chip with a suite of Flash software and firmware for storage devices for music phones and MP3 players. As the demand for video content grows, Samsung expects to promote the chip for storage in mid- to high-range digital cameras. The company expects the high-capacity chip to enter the mainstream market beginning late this year.

The latest product follows by about eight months Samsung's launch of production of a 60-nanometer 8-Gbyte NAND flash memory chip. Samsung has been pushing the envelope in flash technology. In March, the company introduced a 64 Gbyte solid-state flash drive for ultra-portable notebooks. The South Korean company unveiled the 1.8-inch drive at its annual Mobile Solution Forum in Taipei, Taiwan, and said it planned to start mass production in the second quarter.
For someone who remembers ferrite core memory planes, 16Gb on a single chip is simply stupendous. Were it not for the advent of digital photography and music downloads, a single one of these chips could probably hold every single document, spreadsheet and digital record that the average household would produce in a human lifetime. I can still remember the first chip memory devices. A Texas Instruments catalog that I taught myself digital electronics out of proudly announced a 16-bit register file which could simultaneously read and write. Later, whopping 1K serial DRAMs came to market, and it's been onwards and upwards ever since.

While I was working during the late 1970s with Tegal Corporation to develop plasma etching and deposition chemistries, the buzz was all about integrated circuits made with micron (1 millionth, or 0.000001, of a meter) and sub-micron line widths. The Samsung device now uses 51 nm (51 nanometers, or billionths of a meter) line widths. That's 0.000000051 of a meter wide. By comparison, a human hair ranges from 0.000018 to 0.000180 of a meter in diameter (18 to 180 microns).
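
For a rough sense of that scale, here is a back-of-the-envelope calculation in Python, using only the figures above (the 51 nm line width and the 18 to 180 micron range for a hair):

    # How many 51 nm lines fit across a single human hair?
    line_width_m = 51e-9      # 51 nanometers, per the article's 51 nm process
    hair_thin_m  = 18e-6      # a fine human hair, about 18 microns
    hair_thick_m = 180e-6     # a coarse human hair, about 180 microns

    print(f"lines across a thin hair:   {hair_thin_m / line_width_m:,.0f}")   # ~353
    print(f"lines across a coarse hair: {hair_thick_m / line_width_m:,.0f}")  # ~3,529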

During the mid-1980s I worked at Intel performing yield analysis for their stupendous 256K C-MOS DRAM. The largest of its kind at that time, it represented a huge leap ahead (despite other, larger memory chips) due to C-MOS's low power consumption. This important property increased battery lifetimes, then as now a major limiting factor in overall performance. To give an understanding of how small the memory cells were in the 256K DRAM chip, we had to switch over to an ultralow-contaminant molding compound for encapsulating the chips, because it was found that stray radioactive boron in the plastic's glass reinforcing fibers was emitting alpha particles that carried enough charge to flip individual memory bit cells and create soft errors in the stored data.
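
To illustrate what one of those soft errors does to stored data, here is a toy Python sketch; it only shows a single bit flipping, not the underlying cell physics:

    # A single alpha-induced upset: one cell flips, silently corrupting the stored value.
    stored_byte = 0b01100100                  # decimal 100, sitting quietly in memory
    hit_bit = 5                               # suppose the particle strikes bit 5

    corrupted = stored_byte ^ (1 << hit_bit)  # XOR flips exactly that one bit

    print(f"before: {stored_byte:3d} ({stored_byte:08b})")  # 100 (01100100)
    print(f"after:  {corrupted:3d} ({corrupted:08b})")      #  68 (01000100)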

Much as the hard drive's technological demise has been predicted over and over again, so has that of silicon technology, especially the fabrication method known as photolithography. It is identical to the process whereby the copper layer of printed circuit boards is patterned using a photo-sensitive resist, only the scale is incredibly smaller. During the micron line width days of silicon, ordinary light was used to expose the photoresists employed in photolithography. As the line widths increasingly approached the actual wavelength of light, scattering and reflectance notching all began to interfere with accurate replication of the masking image being projected onto the wafer.
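
A quick calculation shows why this became a problem; the exposure wavelengths below are the commonly cited mercury i-line and KrF/ArF deep-UV figures, not anything from the Samsung announcement:

    # Feature size vs. exposure wavelength: the pattern is now far smaller than the light drawing it.
    feature_nm = 51                                                  # the article's 51 nm line width
    sources_nm = {"i-line UV": 365, "KrF deep UV": 248, "ArF deep UV": 193}

    for name, wavelength in sources_nm.items():
        ratio = feature_nm / wavelength
        print(f"{name:12s}: {wavelength} nm light vs {feature_nm} nm features "
              f"(feature is {ratio:.2f}x the wavelength)")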

New approaches using E-Beam (electron beam) and other ultra-narrow beam technologies were developed to circumvent this issue. Remarkably enough, just like the disk drive, which continues to amaze everyone with its longevity, so has photolithography been given new life through the use of Deep UV (Ultra Violet) illumination for developing the resist patterns. It is most likely what Samsung is using to fabricate these little wonders and I am obliged to congratulate them for this achievement. I still have to wonder if they are depositing tungsten conductor layers for this chip using the CVD reactors I used to work on for one of their OEM suppliers.
Posted by: Zenster

#20  Having a shortage of Beethovens in the programming community would probably mean that we need a program that writes such software.

Very interesting and apt analogies, 'moose. The biggest problem with having "a program that writes such software" is encapsulated in the old adage GIGO (Garbage In, Garbage Out): the scribe program is only as good as its originators. Much of what you, very wisely, dance around is related to AI (artificial intelligence), or "machine intelligence". Little short of an AI engine would be required for "listening to an orchestra--not only hearing different instruments individually, but operating in concert".

Perhaps if a conventional engine were given a replete vocabulary and sensing profile of orchestral music characteristics (i.e., adagio, allegro, concerto, sonata, fugue, canon, etc.), it might be able to provide discriminative information, but certainly not the emotional content that symphonic pieces convey. Consider for a moment Beethoven's 6th "Pastoral" Symphony. Not even the most advanced modern computer could divine or identify the allegro "storm" passage as such, while a human ear can readily discern its tempestuous overtones.

As for advanced "multi-dimensional" processing, I would look towards optical computing. Optical processors have already been demonstrated that can compare two different time frames of the same scene and almost instantly (i.e., at light speed) differentiate even very slight variations between the two. Such capability would prove invaluable in FoF (Friend or Foe) combat identification, real-time motion detection and so forth.
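
The frame-comparison part is easy to sketch in conventional code, even if the optical version does it at light speed; here is a toy NumPy example on synthetic frames rather than real imagery:

    import numpy as np

    # Two "time frames" of the same scene, as grayscale arrays.
    rng = np.random.default_rng(0)
    frame_a = rng.integers(0, 256, size=(240, 320)).astype(np.int16)
    frame_b = frame_a.copy()
    frame_b[100:110, 200:220] += 40            # a slight change between the two frames

    diff = np.abs(frame_b - frame_a)           # per-pixel difference
    motion_mask = diff > 25                    # keep only meaningful variations

    rows, cols = np.nonzero(motion_mask)
    print(f"{motion_mask.sum()} changed pixels, "
          f"rows {rows.min()}-{rows.max()}, cols {cols.min()}-{cols.max()}")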

Extremely nuanced input discrimination still awaits much more refined voice interpretation software and pattern recognition macros. Consider that the human mind is estimated to possess approximately one terabyte of storage capacity. We now have disk drive arrays with that sort of memory space. Now, consider how the human eye operates. Long before a signal reaches the brain's optical cortex, it has already been subjected to numerous conformality tests. The retina itself has built-in networks that test for perpendicularity, roundness, squareness, planarity and numerous other geometric qualities.
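
As a loose software analogy (and only an analogy; it is not a claim about how retinal circuits actually compute), orientation-sensitive tests can be written as simple difference filters:

    import numpy as np

    def edge_responses(img):
        """Crude orientation tests: respond to vertical and horizontal edges."""
        img = img.astype(float)
        vertical   = np.abs(np.diff(img, axis=1)).sum()   # column-to-column changes
        horizontal = np.abs(np.diff(img, axis=0)).sum()   # row-to-row changes
        return vertical, horizontal

    # A synthetic 8x8 image containing one vertical bar.
    img = np.zeros((8, 8))
    img[:, 4] = 1.0

    v, h = edge_responses(img)
    print(f"vertical-edge response {v:.0f}, horizontal-edge response {h:.0f}")  # 16 vs 0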

Now imagine the immense difficulty in recreating olfactory, aural and tactile inputs for these advanced computing systems. Until so many of these other quasi-human traits are reliably incorporated, or software simulated, in these computing systems, it will be difficult to obtain the "jumps in performance" being sought.

Bulk information processing, especially as it stands, will never surpass human ability at synthesizing disparate sensory inputs.
Posted by: Zenster   2007-05-03 22:11  

#19  Mark E "Can it make my word documents go back in time?"

I could have used one of those
Posted by: Dan Rather   2007-05-03 20:59  

#18  Mark E: Programming today is either linear or parallel, a limitation of digital processing. To start with, we are reaching a point where hardware can actually process analog data.

An analogy of this would be listening to someone play the trumpet as serial processing. Being able to distinguish between eight trumpeters playing at the same time would be parallel processing.

But multi-dimensional processing would be like listening to an orchestra--not only hearing different instruments individually, but operating in concert. Vastly more information is conveyed, but it is organized as a whole.
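
To make the serial/parallel half of that concrete, here is a toy Python sketch with synthetic "trumpet" signals standing in for real audio:

    import math
    from concurrent.futures import ThreadPoolExecutor

    def trumpet(freq_hz, samples=1000):
        """A stand-in for one trumpeter: a plain sine wave."""
        return [math.sin(2 * math.pi * freq_hz * n / samples) for n in range(samples)]

    def loudness(signal):
        """Per-player analysis: root-mean-square level of the signal."""
        return math.sqrt(sum(s * s for s in signal) / len(signal))

    players = [trumpet(f) for f in (440, 494, 523, 587, 659, 698, 784, 880)]

    # Serial processing: listen to one trumpeter at a time.
    serial_results = [loudness(p) for p in players]

    # Parallel processing: all eight trumpeters analysed concurrently.
    with ThreadPoolExecutor(max_workers=8) as pool:
        parallel_results = list(pool.map(loudness, players))

    print(serial_results == parallel_results)  # True: same answers, different scheduling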

But programming for this, keeping the analogy, would be the difference between writing an instrument solo and writing a symphony.

Having a shortage of Beethovens in the programming community would probably mean that we need a program that writes such software.
Posted by: Anonymoose   2007-05-03 19:36  

#17  "multi-dimensional jumps in performance."

Can it make my word documents go back in time?
Posted by: Mark E.   2007-05-03 18:43  

#16  Here's a good question: what do you think will be the "next wave" in computer technology?

Right now, I see the biggest lag as being in software. Unlike the new and better hardware, which to a great extent is being designed by computer, software is still being done by teams of people.

This means that it seems to be stuck in development, only incrementally changing in a rather linear manner, instead of by leaps and bounds with multi-dimensional jumps in performance.
Posted by: Anonymoose   2007-05-03 18:05  

#15  #14: "did you even bother to read the article's first sentence?"

Geez, Zen - aren't you asking a lot of EP?
Posted by: Barbara Skolaut   2007-05-03 17:25  

#14  Go back to the original article.

Sure thing. Now, did you even bother to read the article's first sentence?

16 Gbyte NAND flash memory.
Posted by: Zenster   2007-05-03 16:47  

#13  Go back to the original article.
16Gb = 16 gigabit = 2 gigabyte
Posted by: Enver Phuns6977   2007-05-03 16:24  

#12  This is way past me; I am still in the 'can I get a four-barrel carb for my Ford?' era.
/small child voice: 'what's a carb, Grampa?'
Posted by: USN. Ret.   2007-05-03 14:46  

#11  Oh... and does this mean I can get a 200GB stick of RAM soon?
Posted by: DarthVader   2007-05-03 12:51  

#10  Ah, geek memories...

The first computer I used in a serious way was a microVAX with 300M of disk space. I once went to a computer users' group meeting where we determined that the combined storage space of the campus VAX cluster was THREE HONKIN' GIG! Woo hoo!

Soon they'll be giving that much away in cereal boxes.
Posted by: Angie Schultz   2007-05-03 12:23  

#9  The old IBM drives (I forget the number but late 60s early 70s) had oil based servos. You needed pans under them to collect the leaks...

Posted by: 3dc   2007-05-03 12:00  

#8  I remember replacing a 10 megabyte drive on a Prime computer. The drive was about one foot by two feet. The service tech asked if anyone wanted it because he really didn't want to hump it out to the car. No takers.

And that was advanced technology.

I once was regaled by one of the first DEC service engineers: A giant two megabyte drive's spindle failed, and it started jumping around, walked sideways, and blocked the computer room door. Then it caught fire. Things got exciting!
Posted by: KBK   2007-05-03 11:21  

#7  What's the max read/write to a block before failure count?

For read replacement of hard drives these chips make sense, BUT for write (esp. Windows' stupid registry and hives - now, if you had a big block of CMOS RAM and Windows was smart enough to keep the registry and hives there and only write them to flash once in a while) it can be an application killer.
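
A rough sketch of that "keep it in RAM, write to flash only once in a while" idea, with hypothetical names and nothing Windows-specific:

    import json, time

    class WriteCoalescingStore:
        """Hold frequent registry-style writes in RAM and flush to flash only
        occasionally, sparing the NAND's limited program/erase cycles."""

        def __init__(self, path, flush_interval_s=300):
            self.path = path
            self.flush_interval_s = flush_interval_s
            self.cache = {}
            self.dirty = False
            self.last_flush = time.monotonic()

        def set(self, key, value):
            self.cache[key] = value              # cheap RAM write
            self.dirty = True
            self.maybe_flush()

        def maybe_flush(self, force=False):
            due = time.monotonic() - self.last_flush >= self.flush_interval_s
            if self.dirty and (force or due):
                with open(self.path, "w") as f:  # the one expensive flash write
                    json.dump(self.cache, f)
                self.dirty = False
                self.last_flush = time.monotonic()

    store = WriteCoalescingStore("registry.json", flush_interval_s=300)
    store.set("HKCU/Example/Setting", 1)         # stays in RAM until a flush is due
    store.maybe_flush(force=True)                # e.g. at shutdown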

Posted by: 3dc   2007-05-03 11:12  

#6  Damn right Darth. I once heard that p0rn0graphers drove the video tape market and drug dealers drove the cell phone market.
Posted by: Spot   2007-05-03 11:02  

#5  Once again, entertainment is pushing the tech market. Let's hear it for gamers and geeks!
Posted by: DarthVader   2007-05-03 10:27  

#4  Zen,

Last I was paying attention, Intel and others are deep into X-ray litho and beyond now (still research land)?

The multi-cores of today are an effort to maintain Moore's law while the next litho tech leap is made.

However, there is a limit to litho, so what is coming down the line, setting aside the multi-cores?
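
For reference, the usual "density doubles roughly every two years" reading of Moore's law works out like this (the 2007 baseline is just a round illustrative figure):

    # Back-of-the-envelope Moore's law: doubling roughly every two years.
    base_year, base_transistors = 2007, 1e9   # round illustrative figure, not a real part
    doubling_period_years = 2

    for year in (2009, 2011, 2013, 2015):
        doublings = (year - base_year) / doubling_period_years
        print(f"{year}: ~{base_transistors * 2 ** doublings:.1e} transistors")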
Posted by: bombay   2007-05-03 09:47  

#3  My first Jump Drive was 8 MB. My friends were amazed. Actually, they asked me why I carried it around.
Posted by: bigjim-ky   2007-05-03 07:18  

#2  the plastic's glass reinforcing fibers were emitting alpha particles

Ohmygosh! Zenster! Is that, like, the radioactive alpha particle? You mean all my chips are all, like, radioactive?

Aieeeeeeee!!! [runs screaming down the hall and out of the building]
Posted by: Bobby   2007-05-03 07:11  

#1  Speaking of "Now in production" > FREEREPUBLIC - TENET [60 Minutes interview ]: CIA-INTELLIGENCE COMMUNITY BELIEVED SADDAM WOULD HAVE HAD NUCLEAR WEAPON BY 2007-2009.
Posted by: JosephMendiola   2007-05-03 03:26  
