
PC Replacement and Windows 10


Kira


  • Administrators

Kira, it's highly unlikely that a modern mobo would have its memory slots placed in such a way that they're fouled by larger gfx cards. That said, it could be pretty tight. The Shark case is so big that HDD placement is unlikely to be an issue, but in any case you've got room to move the drives if necessary. My 580 gets in the way of an HDD slot on my rig, but I've had to go for a smaller case so things are tighter anyway.

 

EDIT: This might be an issue on micro ATX boards



Yup, I've heard that about at least the 770 from reviews.  Shades of "My god, this thing's huge!  It barely fit in my case!" and the like.  I have all kinds of room in the Shark that isn't being utilized.  Still, I need to be careful about placement, as I really don't want to go through what some have on their first build: X doesn't fit with Y, but neither does Z, try W, and so on.

 

Talked to someone who knows far more about power than I.  He said that a 750W power unit is capable of up to 750W, but isn't putting that out all the time.  Just as I suspected.

 

The other day, a friend gave me some RAM.  Unfortunately it wouldn't fit in my current board... and now I think I know why.  I have 3x 2GB sticks of Kingston 2Rx8 PC3-8500U RAM (which should be DDR3)!  Now to find out for certain.  If so, and I can match it with this new board, I may go with just one 4GB stick and use these as well, totaling 10GB.  I've heard that can be a problem with boards with four slots, though.  Basically they run the #1 and #3 slots paired, and the #2 and #4 slots paired.  So I could do 2 and 2, but not 1 and 3.  I'm currently running 2 and 2 on the board in this machine, with 1 and 2 the same and 3 and 4 the same.  No issues.  Perhaps better would be 4+4+2+2 = 12GB, half of it new.


Getting close here.  I think I've found the graphics card.  Scratch that; I need to do some more searching in this sector.  Not that it's a bad choice, I just need to make sure it's the best choice.  I was considering the 760 X2, but I think power consumption trumps card capability in this case.  It's over twice the power for roughly twice the capability in some areas.  If I ever wanted that much more, I could just get another 960 in SLI, right?  I'd certainly save enough in energy in the meantime.

 

http://gpuboss.com/gpus/GeForce-GTX-960-vs-GeForce-GTX-760-X2

 

Regarding RAM, the options:

 

(% cost is the price relative to a $10-per-GB baseline, which I've found works out close to 100% for most kits; worked out in the quick calc after the list)

 

1x 8GB $64 = 80% cost, 1333 DDR3 (GSkill)

or

1x 8GB $82 = 102% cost, 1600 DDR3 (GSkill)

or

1x 8GB $67 = 85% cost, 1600 DDR3 (Corsair)

or

2x 4GB $65 = 81.2% cost, 1333 DDR3 (GSkill)

or

2x 4GB $61 = 76% cost, 1866 DDR3 (Kingston)
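
 

For anyone wanting to reproduce those percentages, here's a throwaway calc (Python; the $10-per-GB baseline is just my own yardstick, and my listed figures are rounded):

# "% cost" = price relative to a $10-per-GB baseline (my own rule of thumb).
BASELINE_PER_GB = 10.0

options = [
    ("1x 8GB GSkill 1333",   8, 64),
    ("1x 8GB GSkill 1600",   8, 82),
    ("1x 8GB Corsair 1600",  8, 67),
    ("2x 4GB GSkill 1333",   8, 65),
    ("2x 4GB Kingston 1866", 8, 61),
]

for name, gb, price in options:
    pct = 100 * price / (gb * BASELINE_PER_GB)
    print(f"{name}: {pct:.1f}% cost")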

 

It's so much lower in cost for higher capability; is there something wrong with Kingston?  Just less advertising?  Why's their product cheaper?

 

I've run across information stating that you don't really need the 1600, that 1333 will do just fine, thanks.  Not being a nitpicker on these things, I'm inclined to forgo the 1600 in favor of the cheaper Kingston... unless there's something seriously wrong with Kingston RAM.  Considering the 100% positive review rate on Newegg, I'd say that's not likely.

 

Major question remains: 1x 8GB, with future options up to 32GB (the mobo's maximum), or 2x 4GB, per the problems Jabo had with 8GB RAM sticks...?


  • Administrators

(Quoting Kira's post above about card sizes, the 750W PSU, and mixing the Kingston 2GB sticks with new RAM.)

 

Just to clarify: on the majority of mobos with 4 DIMM slots, they are organised into two banks (numbered 1 & 2), each bank containing 2 slots (bank 1 = DIMM0 & DIMM2, bank 2 = DIMM1 & DIMM3). My approach is to ensure that I have identical RAM within a bank, but this can differ between banks, so using your example I would have:

 

RAM Slot:     0    1    2    3
RAM Capacity: 4    2    4    2

 

Here's the thing - most modern mobos are far more tolerant of mixing and matching RAM than older ones, but as I am an old-school builder I tend to stick to doing things the way I know works, without getting too adventurous.
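
 

If it helps to see that rule written out, here's a minimal sketch (Python; the function is purely illustrative, slot numbering as in my table above):

def banks_matched(slots):
    """slots: capacities in GB for DIMM0..DIMM3."""
    # Bank 1 pairs DIMM0 & DIMM2; bank 2 pairs DIMM1 & DIMM3.
    return slots[0] == slots[2] and slots[1] == slots[3]

print(banks_matched([4, 2, 4, 2]))  # True  -> the layout in my table above
print(banks_matched([4, 4, 2, 2]))  # False -> 4GB paired with 2GB in a bank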

 

(Quoting Kira's post above on the GTX 960 vs 760 X2, the RAM pricing options, Kingston, and the 1x 8GB vs 2x 4GB question.)

 

As far as slapping in an extra gfx card and going SLI - I don't know that it's as easy as all that, particularly as I think you need two PCIe x16 slots rather than one PCIe x16 and one PCIe x4. The slots may look the same, but that doesn't mean they are the same. I would be inclined not to worry about this at all: buy the best graphics card you can afford and stick with that. Oh, and your two cards would need to be identical right down to the manufacturer.

 

As for why Kingston is cheaper - who knows? They have a deserved reputation for solid, reliable products - that's all I would consider.


The mobo quoted by Newegg is an SLI board, but I wouldn't really bother too much about that.  I'm with Jabo; go for the best single card you can afford.

 

Kingston memory will be fine.

 

Oh yeah; avoid anything that touts itself as "military grade".  The reality is contractors have a habit of selling the military any old shit 'cause they mostly get away with it! 


Sounds good.  I agree about SLI, feeling that if I'm going to spend twice the money, I'd better get twice the performance, which isn't always the case.  As already stated, I like pretty, but I'm no performance freak!

 

As I heard a DI say once: "Remember kids, if it's military, it's government, and thus made by the lowest bidder!"  This was in regard to mil-spec "Birth Control Goggles" glasses with atrocious lens tolerances that resulted in headaches, dizziness, etc. for the poor new recruits who had to wear them due to poor eyesight.  I'm sure they didn't help much.


How do I determine which direction the airflow goes through a graphics card (before buying it), for airflow management?  I'd hate to have it blowing downwards and dumping all that hot air right onto the sound card.  Can you say "meltdown"?  I guess it could go out the mesh side of the case; the side opposite the board is essentially open on the Thermaltake Shark.  It's still going to heat that sound card a bit if it's venting downward.

 

My setup is a low-mount fan pulling air in at the front and a mid-mount fan pulling air out at the back.  My CPU fan is mounted directly in front of the "dump" fan in the back, but at a 90-degree angle, pulling air onto the CPU.  I do not know, but strongly suspect, that this will be the same for my new CPU fan.  Additionally, my power supply is mounted at the top of the case in the rear, directly over the "dump" fan, with airflow into the case, which should then be pulled right out by the "dump" fan.  Oddly, this is a configuration that Tom's Hardware doesn't cover.  They show both mounting directions for a bottom-mount PSU, but only one for the top: pulling heat from the CPU and dumping it out the back.

 

Oh, and of course the dumped hot air, since the exhaust fan is directly below the PSU intake, rises and gets pulled right back into the PSU.  Great design!

 

If the GPU is mounted in the same place as my current card (likely?), then the fan, assuming it pushes air up, will dump hot air from the GPU directly in front of the CPU fan, which will then pull that nice hot air onto the CPU if the big 120mm "dump" fan in the back doesn't pull that hot air out first.

 

Here's a good Tom's Hardware article with schematic illustrations explaining my concerns much better than I can.  Did I mention illustrations?  Definitely worth a thousand words.

 

http://www.tomshardware.com/reviews/cooling-airflow-heatsink,3053-11.html

 

Oh, and by the way...  I've been looking at power supply unit cooling as well.  Mine (the one that I got built for me 10 years back) is, well... funky, to say the least.  It's a top mount, at the back of the case.  They mounted the thing such that it draws air from outside the case, runs it through the PSU, and dumps the hot air, you guessed it, inside the case.  Directly over the CPU fan, which then pulls said hot air straight into the CPU before that good old 120mm "dump" fan can grab it.  Way to go guys!  Some expert PC builders you are.  No wonder my CPU temps were off the charts.

 

I'm going to have to do some serious searching to see if I can find a way to change that on this new build's PSU using the same case (I hope).  That's not cool.  Literally.  If I can't, I might be looking at the case market, too.  Hate to do that given what I paid for the thing in the first place.

 

A possible solution would be to somehow mount fans on the side over the mesh that comprises 40-50% of the side of the case opposite the motherboard.  The questions would be:

 

1. Mounting.  About the only way to do so would be with zip ties or the like.  Not exactly vibration-free.

 

2. Power.  I'm sure I'd have to get power wire extensions for those fans to get power from the board.

 

3. Direction of flow.  Do I pull hot air out, or push more cool air in?  I'd think it better to push more cool air in.  Pulling might take the cool air from the front of the unit before it had a chance to do its job.  (A rough balance check is sketched below.)
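
 

One rough way to frame the pull-vs-push question is net case pressure: total intake CFM minus total exhaust CFM (Python; the CFM numbers are made-up placeholders, real figures would come from the fan spec sheets):

intake_cfm  = [50]      # front 120mm intake (placeholder CFM)
exhaust_cfm = [50, 35]  # rear 120mm "dump" fan + side fans if set to pull (placeholders)

net = sum(intake_cfm) - sum(exhaust_cfm)
if net > 0:
    print(f"+{net} CFM: positive pressure, excess air leaks out through the mesh")
elif net < 0:
    print(f"{net} CFM: negative pressure, air gets pulled in through the mesh")
else:
    print("balanced intake and exhaust")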


  • Administrators

You won't need to use a soundcard with the new mobo. It's all built in!

The usual airflow direction with a gfx card is cool air in at the front, passed through the heatsink, and blown backwards out of the back of the machine and downwards by the fans.

As far as airflow in general is concerned, I have a 120mm fan in the front of the case to draw in cool air and another 120mm in the back to exhaust the hot air. When I had a Shark case I actually blocked the mesh side with a sheet of acetate film to try and smooth the airflow as far as possible. Never had a component overheat though, so I doubt you'll have much of a problem.

 

Standard mount for a PSU like yours is to draw warm/hot air from inside the case and exhaust it out the back - why anyone would decide to do anything else is a bit of a mystery. A lot of modern cases mount the PSU at the bottom of the case with a vent beneath it, so the PSU draws air from outside (under the case) and blows it out of the back - keeps the PSU cooler and prevents other components being affected by generated heat. Again though, don't worry about this - the Shark's top-mount PSU bracket will be fine with your new PSU.

 

Don't forget to pay careful attention to the routing of cables inside the case when you build this; it's a pity to pay attention to the cooling only to have a bird's nest of wiring in the case blocking the airflow.

 



Well, looks like I screwed up by waiting.  The graphics card I was looking at just went up $30.  Damn!

 

http://www.newegg.com/Product/Product.aspx?Item=14-125-770

 

Note the lack of fans out the back... but an extra fan in the middle.  I don't think this one vents out of the case through the card; I think it just moves more air with three fans rather than two and sends it up to be pulled out by the exhaust fan.

 

The other option I'm considering:

 

http://www.newegg.com/Product/Product.aspx?Item=N82E16814487091&cm_re=GTX_960-_-14-487-091-_-Product

 

I have an EVGA card now and have had no issues with it.  I am extremely leery of the replacement policy, however.  It seems that if you pay them for a new card, and the one they send fails, is DOA, or otherwise gets returned within the return window, they send you a refurbished card as a replacement.  Talk about a major rip-off!  I guess I could just get a refund and try again later, so as not to get stuck paying new price for a rebuilt card if something goes wrong.

 

Unfortunately, I'm looking at what was my first choice and seeing some issues there, too.  Specifically, the fact that it doesn't vent out the back, as most do.  Seems odd.  Also, the price went up, and they tacked some major shipping on it, too, so it's now pretty much outside my window for a "reasonable" card.


  • Administrators

None of the current graphics cards have rear fans, Kira; they may have a grille where some of the air moving sideways off the fan gets blown, and the card will be mounted on the mobo with the fan side facing downwards, not upwards. This is standard practice.

 

The fact that the 960 doesn't have a grille doesn't necessarily pose a problem - the 900 series runs cooler than the earlier series cards due to lower power consumption (if I've understood things correctly) and therefore may not need the rear grille. What this means in practice is that you get a whole bunch of extra connectors on the back of the card.


Lost out on the video card, but I'll be okay with the EVGA one.  Got a good combo deal going with the board and CPU I wanted.  It'll save me $35 overall.

 

Here's what I'm looking at:

 

EVGA 02G-P4-2966-KR GeForce GTX 960 Gaming 2GB 128-Bit GDDR5 PCI Express $210

 

Intel Core i7-4790K Haswell Quad-Core 4.0GHz LGA 1150

 

+

 

ASUS Z97-A LGA 1150 Intel Z97 HDMI SATA 6Gb/s USB 3.0 ATX Intel Motherboard 

 

Comboed at $445,

 

saving me $35 versus "first look" costs of $335 for the CPU and $145 for the board ($480 total), and $55 versus the original costs of $350 + $150 = $500.

 

Total: $655 for these components.
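
 

Checking my own arithmetic with a throwaway calc (prices as listed above):

combo_cpu_board = 445
first_look      = 335 + 145  # CPU + board at "first look" prices = 480
original        = 350 + 150  # CPU + board at original prices = 500

print(first_look - combo_cpu_board)  # 35 -> savings vs first-look
print(original - combo_cpu_board)    # 55 -> savings vs original
print(210 + combo_cpu_board)         # 655 -> total with the $210 GTX 960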

 

I think I'm happy with that.  Anyone find any major glaring errors there?


Win 8.1 Pro works great, better than Win 7 Ultimate imho

 

I have had most operating systems installed for general gaming,

and 8.1, with a couple of configuration clicks, starts and runs the same as Win 7.

 

No driver issues and a great interface for apps and your old school desktop.

 

Added bonus for MSFFB2 users: you can go to the desktop and back into IL2 1946 without losing your FFB  :)

 

Have fun


  • Administrators

That's looking good Kira - now go push the damn button. (just kidding)

 

Re: Win 8/8.1 - I can't really say much about this due to lack of experience, but what I have seen I didn't like much. I don't see much point in migrating to it now, with Win 10 on the horizon and Win 7 working well for me, but Kira, you should go for whatever you prefer. Just make sure you go for the 64-bit variant of whichever OS you decide to use.


I would, except for one last minor (major) question: the power supply.  The calculators are saying 600W or so; one even ran as low as 500W at full load, others 750W.  I've found an 850W Gold for a good deal (the price of a 550W Bronze), yet I'm questioning whether, being pretty much twice a padded recommendation, that might be waaayyy overboard, and hence the unit wouldn't be running within its optimum efficiency range, i.e. 20-90% load.  Problem is that the calculators don't have entries for peripherals like rudders, joysticks, TrackIR, or multiple monitors.  These are things I need to plan for now, yet I'm just not sure how much they pull.  I've been plugging other things (USB) into the calculator to get some idea, and I'm still running about 550-600W depending on the calculator.
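
 

To put rough numbers on that worry (Python; the 20-90% band is the rule of thumb I've seen quoted, and 600W is the padded calculator estimate, nothing measured):

def in_band(load_w, psu_w, low=0.20, high=0.90):
    """True if load_w sits inside the PSU's comfortable band."""
    frac = load_w / psu_w
    return low <= frac <= high, frac

for psu_w in (550, 650, 750, 850):
    ok, frac = in_band(600, psu_w)  # ~600W worst-case calculator estimate
    print(f"{psu_w}W PSU at 600W load: {frac:.0%} of capacity, in band: {ok}")
# The low end matters too: a guessed ~100W idle draw is only 12% of an 850W unit.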

 

Electricity just isn't my thing.


  • DDz Quorum

USB peripherals don't take much juice, nothing at all compared to a video card.

 

Unless you are planning to hook up like 270 of those, the PSU is never a problem...

 

The only thing you need to be aware of is that when you use a USB hub, you should get one that has its own power supply...

 

Do you remember what I posted earlier? 

Running CloD on an i5 / 16GB / Nvidia GTX 580 - together with a secondary, slower computer, powering all that, including three monitors... total consumption 450W max... this with an FFB stick, a Bodnar board, a Roccat Kave 5.1 Digital headset, a PCTV USB stick, a Samsung printer, and whatnot...

 

No need to go overboard towards the 700W range...

 

Development still shows that they get better performance using less and less power...


Okay.  I was wondering about that.  It sounds like that 850 Gold was waaayyy overboard, even if the price was right. 

 

I was rather concerned coming up with those low numbers.  Kept forgetting the improvements in tech.  Basically, those numbers were odd, especially when considering the jump in capability of the processors, for just a tweak more power.   Intuitively, it makes no sense, yet the reality is that it really is that way.  I was looking at around 600W, probably will go with 650W just to be on the safe side of the safe side.

 

Right. Scratch, edit, delete, whatever.  Finally figured it out.

 

1. Estimates are just that: estimates.  Ballparks.  Fudge factors.  Reality: "Meh, we don't know, it's your system, you decide!"

 

2. Power supplies have been known to deliver far less than they claim.

 

3. Old power supplies definitely supply less than they did when new.  This can happen in as little as a year.

 

4.  To hell with it, I've a good deal on a big arse power supply that's bound to be more than I need, but I won't ever leave my system wanting no matter how much I add to it.

 

Decision made!


Beautiful computer, sir.  Why are you running it on a CRT???

 

LED monitors.  Is 1920x1080 really that much better than 1600x900?  I'm losing a half inch in monitor size, but getting the higher resolution, for $10 more.  (Slightly) smaller monitor, better resolution, slightly higher price.

 

1600x900 is Acer, 1920x1080 is AOC.

 

Could always run the 1920x1080 at 1600x900 if I wanted (yeah, like I'd really want to!)
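
 

For what it's worth, the raw pixel math (quick calc):

px_1080 = 1920 * 1080  # 2,073,600 pixels
px_900  = 1600 * 900   # 1,440,000 pixels
print(f"{px_1080 / px_900:.2f}x")  # 1.44x -> ~44% more pixels for the GPU to push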

 

Game-Debate.com gives the GTX 960 a 10 at 1600x900 and a 9.8 at 1920x1080.

 

http://www.game-debate.com/gpu/index.php?gid=2436&gid2=882&compare=geforce-gtx-960-2gb-vs-geforce-gtx-760

 

Decisions, decisions.  (Why am I debating $10 for a slightly smaller monitor with better performance that's still 4.5" bigger than my current one?  We've been here already...)


And that's why I go here for questions.  There's always someone who knows the answer, once I get around to asking the right question.  Thank you, once again, Jabo.  You've been a big help with this whole thing, along with the other woofs who've had input.

 

Welp, I've officially splurged.  Went and got a TrackClip for wayyy too much.  A delicate little piece of plastic that then cost half again its price for shipping! (GRRRR.)  I'll find a way to get my camera setup to work on this new, thinner monitor, but the old tracker just didn't seem to be working right.  Again, thanks to the Dog who got me hooked up with the original in the first place.  Worked like a charm when it did.


  • 2 weeks later...

Well, the monitor arrived today.  Needless to say it looked good... until I turned it on.

 

Oh, it worked, all right, but the brightness was ridiculous, causing washout.  My reds are now oranges, and no amount of fiddling with brightness or contrast seems to make them red again, or my blues deep blue, or my blacks not grey. 

 

Any suggestions for something I'm missing?  Here are the specs:

 

Webpage:

 

http://www.bestbuy.com/site/acer-19-5-led-hd-monitor/8816192.p?id=1218902549934&skuId=8816192

 

http://us.acer.com/ac/en/US/content/model/UM.IS0AA.002

 

Screen Size: 19.5"
Screen Mode: HD+
Response Time: 5 ms
Aspect Ratio: 16:9
Horizontal Viewing Angle: 170°
Vertical Viewing Angle: 160°
Backlight Technology: LED
HDCP Supported: Yes
Panel Technology: Twisted Nematic Film (TN Film)
Adjustable Display Angle: Yes
Adjustable Display Height: No
Adjustable Display Pivot: No
Maximum Video Resolution: 1600 x 900
Standard Refresh Rate: 60 Hz
Colors Supported: 16.7 million
Contrast Ratio: 100,000,000:1
Brightness: 250 Nit (cd/m²)
Interfaces/Ports: DVI, VGA
Input Voltage: 110 V AC / 220 V AC
Operating Power Consumption: 14 W
Standby Power Consumption: 490 mW
Off-Mode Power Consumption: 320 mW

Physical Characteristics
Color: Black
Height: 13.1"
Width: 18.3"
Depth: 6.6"
Weight (Approximate): 4.40 lb
Weight with Stand (Approximate): 5.10 lb

 

I'm thinking that 250 Nit is the problem, but I'm not sure.  It falls into the category of one more thing I just don't know enough about to ask an intelligent question to get the answer I'm looking for.  I've come across a website talking about my old Gateway 2000 Vivitron 1776 CRT as about 170 Nit.

 

Fortunately, Best Buy seems to be better than Newegg when it comes to actually returning something you're unhappy with.  Called them up, and I can return it to a brick and mortar place rather than shipping it back.

 

Next up, try to find out the Nits/cd/m² of my old monitor.


  • DDz Quorum

My Samsung S24B150 has a 200 cd/m² brightness spec...

 

Brightness / Luminance

 

"Brightness as a specification is a measure of the brightest white the TFT can display, and is more accurately referred to as its luminance. Typically TFTs are far too bright for comfortable use, and the On Screen Display (OSD) is used to turn the brightness setting down. Brightness is measured in cd/m² (candela per metre squared). Note that the recommended brightness setting for a TFT screen in normal lighting conditions is 120 cd/m². Default brightness of screens out of the box is regularly much higher, so you need to consider whether the monitor controls afford you a decent adjustment range and the ability to reduce the luminance to a comfortable level based on your ambient lighting conditions. Different uses may require different brightness settings as well."

 

So, lower is better in this case.
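
 

If the control scales roughly linearly (a big assumption; it often doesn't), a quick estimate of where to set the OSD brightness:

panel_max = 250.0  # cd/m2, the Acer's spec
target    = 120.0  # cd/m2, recommended for normal lighting (quoted above)
print(f"try the OSD brightness around {100 * target / panel_max:.0f}%")  # ~48%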

Doesn't it have a reset to factory defaults setting/function?

