Subj : Re: virtual addresses
To   : comp.programming
From : Ed Prochak
Date : Wed Sep 28 2005 07:12 am

Brian wrote:
[]
>
> I offer that I use XP with the same OS version, "nearly" the same
> software, the same OO paradigm that's been around for the past 10
> years or more. Microsoft's stuff has been Win32 and MFC (wrapper)
> for ages, hasn't it?
>
> Same libraries. Same OS. How does this add up to bigger apps
> exactly? Office still fits on the same ISO CD it did in 1995.
>
> Meanwhile, memory has grown by something pretty close to Moore's
> law. In 1995 16 MB was aggressive. Using Moore's law, 16*2^5
> equals 512 MB today. On paper.
>
> But I'm seeing 2GB being used quite commonly. Memory has actually
> grown faster than Moore's law.
>
> Exponential memory growth. Constant or linear application
> growth. You can see why I don't buy the argument.

Memory is occupied by code and data. Your program code may be the
same size, but data size is continually growing.

Simple example: how many JPEG images did you have on your PC in
1995? And images are growing in size very fast. Both resolution and
color depth have grown, so where once images were 300 DPI black and
white (1 bit per pixel), we now have 2400 DPI full color (32 bits
per pixel) images.

Do the math. Doubling the color depth doubles the memory required;
that's linear. But at the same time we are doubling the resolution,
which by itself quadruples the memory required. (Resolution is
expressed as linear DPI, but remember, the image exists in two
dimensions, so going from 300 DPI to 600 DPI increases the size of
the image by four times.) If nothing else, pixels will consume
memory as fast as it becomes available. And just wait for when we
get 3D holographic displays. It's all about the data. (There's a
short sketch of this arithmetic further down in this post.)

    Ed

> The entire question boils down to this. What type of memory
> is faster? Is there enough of it?
>
> And I think that, perhaps on a tunable level, system memory is
> plentiful enough to do away with the hard disk. The problem is,
> whenever the swap file is eliminated, performance does indeed
> suffer.

Maybe in XP, but UNIX systems don't seem to have that problem.
Diskless workstations from Sun work quite well.

> The only reasonable explanation is that the virtual memory
> manager degrades at the boundary case - similar to zip compression
> creating a bigger archive file than the raw input.

Maybe the one in XP does.

> That's the part that hasn't been explained to me. I'll dive into
> Linux or OpenSolaris one day and figure it out -- assuming swap
> files aren't eliminated in the meantime.

Oh, there will always be plenty to fill it up. Consider that the
first PCs did essentially just straight typing. Then printing came
with a variety of fonts, but on-screen displays still showed plain
text with embedded formatting. Then WYSIWYG displays meant more
memory for both the document and the display. Then add still
images. Next, PCs were displaying high-resolution web pages with
animated graphics. And now a typical PC is at least downloading and
displaying video, if not editing it. (So the growth of memory
required for images now extends into a third dimension: time.) And
have you not seen the advancements in computer games? Even Quicken,
the once-simple checkbook program, now includes options for
tracking different account types (savings, CDs, and other
investments) and producing graphs and other reports.
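To put rough numbers on the image example above, here is a minimal
C sketch. The 8.5 x 11 inch page size is my assumption; the thread
only gives the DPI and bit-depth figures.

#include <stdio.h>

/* Bytes needed for an uncompressed raster image of a page.
 * Dimensions in inches, resolution in DPI, depth in bits/pixel. */
static double image_bytes(double w_in, double h_in, int dpi, int bpp)
{
    double pixels = (w_in * dpi) * (h_in * dpi);
    return pixels * bpp / 8.0;
}

int main(void)
{
    /* Assumed page size: 8.5 x 11 inches (not from the thread). */
    double then = image_bytes(8.5, 11.0, 300, 1);   /* 1995: B/W   */
    double now  = image_bytes(8.5, 11.0, 2400, 32); /* 2005: color */

    printf("300 DPI, 1-bit  : %8.1f MB\n", then / (1024 * 1024));
    printf("2400 DPI, 32-bit: %8.1f MB\n", now / (1024 * 1024));
    printf("growth factor   : %.0fx\n", now / then);
    return 0;
}

The 8x linear resolution increase squares to 64x, times a 32x depth
increase, gives a factor of 2048: a roughly one-megabyte page scan
becomes a roughly two-gigabyte one.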
So yeah, if you do not add any new applications, never visit newer
web pages (and don't install any new plugins for your browser), and
never write documents any bigger or with newer fonts or images than
what you did before, then your XP system should work fine in the
2 GB machine without swap space. Or, if the memory manager is
broken, configure a 1 GB RAM disk for the swap file and watch it
hum. But the data you use today is nothing like the data you used
yesterday.

Hope that explains the reason for needing that extra RAM.

    Ed
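P.S. The video point is the same arithmetic with a time axis. A
rough sketch; the 640x480 resolution, 24-bit color, and 30
frames/sec are my assumptions, not figures from the thread:

#include <stdio.h>

int main(void)
{
    /* Assumed clip format (not from the thread). */
    long width = 640, height = 480;
    long bytes_per_pixel = 3;          /* 24-bit color */
    long fps = 30;

    long frame = width * height * bytes_per_pixel;
    double per_second = (double)frame * fps;
    double per_minute = per_second * 60;

    printf("one frame : %ld KB\n", frame / 1024);
    printf("one second: %.1f MB\n", per_second / (1024 * 1024));
    printf("one minute: %.1f GB\n", per_minute / (1024.0 * 1024 * 1024));
    return 0;
}

Even at that modest resolution, one uncompressed minute is about
1.5 GB, which is the "third dimension: time" point above.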