General Question


What do I need to look for in a good gaming graphics card?

Asked by Ame_Evil (3051 points) July 4th, 2010

Hello

I am looking for a cheap graphics card ($110 or £70 at the most) for my first gaming desktop PC. I have virtually no clue what I am looking for, so any help pointing me in the right direction is appreciated. I hope to be able to play any of the high end games (Crysis for example) without any problems.

If there is anything specific you need to know about the rig, let me know.

Thanks for the help.


42 Answers

Lightlyseared:

With a graphics card, how powerful a card you need really depends on the resolution of the screen you use. For example, if your screen resolution is 2560×1600, then even the most powerful cards out there will struggle to play Crysis with all the settings turned up to maximum (although who cares, as Crysis is actually a pretty boring game once you get past the shock and awe of how much power it takes to run the damn thing).
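To put rough numbers on that (this is just pixel counting; real-world scaling varies from game to game), here is a quick sketch:

# The work a card does each frame scales roughly with the number
# of pixels it has to shade, so higher resolutions mean fewer frames.
resolutions = [(1680, 1050), (1920, 1080), (2560, 1600)]
base = 1680 * 1050
for w, h in resolutions:
    pixels = w * h
    print(f"{w}x{h}: {pixels / 1e6:.1f} MP ({pixels / base:.2f}x the 1680x1050 load)")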

If you are serious about using your PC for gaming, the minimum I’d go for is a Radeon HD 5770, which will run Crysis (just) on a 1,680×1,050 screen (typical resolution for a 19″ monitor) at high settings without AA, but it’s about £120, which I realise is nearly twice your budget.

Ame_Evil:

Hmm, do you think something like http://www.dabs.com/products/xfx-ati-radeon-5550-hd-550mhz-1gb-pci-express-hdmi--includes-aliens-vs-predator--6ZMS.html would be better suited to my needs? I don’t want perfect graphics, tbh; I just want to be able to run games decently lag-free and be able to see more or less clearly where I am going.

I hope to get an Acer 22″ monitor, which has a native resolution of 1680×1050. I am tempted to get one with 1920×1080 pixels instead, but I can just reduce the resolution when playing games, right?

Sorry if I sound like a complete idiot, because I am.

croatoan:

Well, the 5670 and the 5770 are in your price range (granted, the 5770 is $40 more than what you want to spend), but according to the reviews they play almost anything at the resolutions you mention. And yeah, turning the resolution down a bit when you’re playing is a good way to gain a few frames back.

To give you an idea of how the whole graphics-card war is sometimes taken a bit out of proportion: I currently have a Radeon 4850 in my Windows machine and I can play Dragon Age and Mass Effect 1 and 2 at 1680×1050 with high settings (only dialing down the anti-aliasing a bit) lag-free. And this is a card almost 2 years old. My recommendation would be spending $40 more on the 5770 and getting the 22″ if you just want to game.

P.S. – If you’re gonna run 4 GB of RAM you’ll get even better results.

Ame_Evil:

Mmk, thanks very much for your advice :). I’ll probably see how much the rest of the machine costs before deciding, but I’m guessing spending the extra $40 will be worth it in the long run.

jerv:

I am with @Lightlyseared here; even most cards that cost more than that will have a bit of trouble with Crysis.

Personally, the only decent card I found in that range last time I looked was a GT 240. You really open up your options if you can spend $150 though.

ApolloX64:

The HD 5770 is the bare minimum when it comes to DX11 gaming; the whole 57xx series and below is actually rather weak in terms of performance, due to very low memory bandwidth compared to the HD 58xx series and even the mighty-mouse GT 240 (I honestly don’t know how nVidia got a card that small to perform so friggin’ well). A GTS 250 will still run almost anything and they are dirt cheap, and you can still get the HD 4850/4870 for super cheap before they disappear; they will run pretty much anything out there. Keep in mind that if you go for those cheaper options you sacrifice being able to use DX11, but from the sounds of it I don’t think you’d miss it, because most of the DX11 features are only viable when you’re running an HD 5850/5870/5970 or GTX 460/470/480 at the moment.
Also keep in mind: a good graphics card is only as good as the system supporting it.

Lightlyseared:

I’d still go for the HD 5770. The HD 5550 is not really that powerful; I doubt it would run any recent game at acceptable settings at the native resolution of your monitor. While you can reduce the resolution to get a performance boost, you will often end up playing games at no better than old SD-TV resolution (and really, who wants that? Why buy a full-HD monitor if you can’t play anything on it?). The difference in price between that card and one that won’t leave you feeling disappointed and short-changed is the price of one or two games. It would be better to save for a month or two and do it properly than buy a cheap card now and regret it later.

jerv:

That brings up a good point; aside from improved multi-threading support (which they should’ve had five years ago!), what does DX11 really bring to the table?
I mean, some of the stills I’ve seen look a little better, but when I am playing a game, I am often too busy doing other things (like not getting killed) to bother counting the blades of grass in a front yard three blocks down.
Maybe it’s just that I am more interested in gameplay than in obsessing over the number of vertices/second that my card can draw, but the way I see it, it’s not really worth getting any prettier than DX10. For me, as long as it doesn’t make me want to gouge my eyes out and can run smoothly at all times (1600×900 resolution, medium detail, and a framerate that never drops below 30 even if the computer is having a seizure under load), then I am happy. If I have to drop resolution below that, or I see framerates dip into the 20s even occasionally, then I cry.

Lightlyseared:

@jerv It’s still early days for DX11, but there are some interesting features, including multithreaded rendering and GPU computing.

jerv:

@Lightlyseared But what for? I mean, I understand the desire for faster computing, but really. Are we going to start selling 10,240×7,680 monitors? Do we need framerates faster than the human brain can process? At least from a gaming perspective, I have to wonder how far we really need to go.

As for GPU computing, programmers have barely acknowledged that modern PCs have more than one core. Well, except for those that do Linux; there is a reason that most of your massively parallel computers use Linux. Is it any wonder that Microsoft hasn’t grasped something that bored European hackers have been doing for years?

ApolloX64:

Actually, going all the way back to XP, Windows already had the capability and programming options for multi-core CPUs, thanks to the Xeon dual-CPU systems from way back. It’s just that they were never actually used properly until SP3. The problem lies in that most programs are written along a guideline of “What does our user have?”, which was predominantly single-core CPUs (Pentium 4s with Hyper-Threading do NOT count; I don’t care who says it’s “like having two CPUs!”, they can go f**k themselves), so most programs were never written with multi-threading in mind until about four years ago or so. Games are even worse off, thanks to the fact that most games are also made for consoles; when a game is made for console first and then ported, it’s not written to take advantage of the myriad of hardware available to a PC enthusiast.
DX11 offers a way out. It is a completely new version, even compared to DX10. It gives a game programmer (or even regular software) a whole new venue, allowing them to map functions to the GPU that were not previously available. Also, tessellation is ridiculously cool; if you have a DX11-capable card, go download the Heaven demo and you’ll see what I’m talking about.
Back when the HD 3800 series came out from AMD, I remember them saying that if you wanted to, you could flash the software on the card and make it run your refrigerator. Well, DX11 makes this possible at the software level, not just at the hardware level. It incorporates all functions into one nice and neat package, unlike DX10, which was supplanted very quickly by 10.1 and now 11 because, as good as it was, it was too constrained and complex to work with.
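Nothing DX11-specific, but to give a feel for what multithreaded rendering is about, here is a minimal Python sketch (a toy model, not how a real engine works):

# Toy model of multithreaded rendering: per-frame work that splits into
# independent tiles can be spread across cores instead of queuing on one.
from concurrent.futures import ProcessPoolExecutor

def shade_tile(tile_id):
    # Stand-in for one independent chunk of render work.
    return sum(i * i for i in range(200_000)) + tile_id

if __name__ == "__main__":
    tiles = range(16)
    # One core, one tile at a time:
    serial = [shade_tile(t) for t in tiles]
    # The same tiles spread across all cores (processes here, since
    # Python threads don't parallelise CPU-bound work):
    with ProcessPoolExecutor() as pool:
        parallel = list(pool.map(shade_tile, tiles))
    assert serial == parallel  # same output, produced in parallel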

jerv:

@ApolloX64 Yeah, XP brought NT to the mainstream, though it seems that even XP was a little shaky on SMP. (And I am with you on HT; while handy, it’s really more like two half-CPUs, and last I checked, 2x(½)=1, not >1.) That is always an issue when a committee tries to figure out what consumers actually want/need. And I agree that Microsoft was more prepared than many other software writers, though it still seems odd to me that it’s taken the others three generations of hardware to catch up.

It seems to me that the real issue here is that old dogs can’t learn new tricks. Or, more likely, they wanted to wait until the current software was so bloated that it really couldn’t run on modern hardware without multi-threading, and didn’t bother to even lay the groundwork until pretty damn recently, before making loads of money off a new version that should’ve come out two years ago when Core 2 Duos started running rampant. And since the Core i-series wasn’t exactly a surprise, you’d think software makers would’ve learned something from last year’s headlines and adapted accordingly by now.

I also think it’s funny that the hardest, most computationally intensive thing we run on modern computers is games, and that a computer that could calculate the last digit of pi in under three seconds would still choke on Crysis.

ApolloX64:

@jerv Lol, completely right. Oddly enough, Crysis represents one of the only PC-only game engines made in years that truly stressed PC hardware, and it still does.
I love how the Core 2 series of chips was rampant for several years until, all of a sudden, they just disappeared; it’s hard to find replacement chips or motherboards outside of eBay now. The Core i series makes no real sense to me, especially when it comes down to cost, since switching requires a new motherboard and usually new RAM, since they all use DDR3. Ironically enough, the Core 2 series on LGA775 went through no less than 7 socket revisions, most of which weren’t even compatible with each other, and now they expect users to shell out for yet another new socket, which of course they have. AMD, meanwhile, has made possibly one of the greatest breakthroughs for their product line: an AM3 CPU can run in any motherboard right down to an original AM2 (as long as it has a BIOS upgrade), and since AM3 chips have an on-die DDR2 and DDR3 controller, you can buy your CPU first and the motherboard later.
I’m no fanboi, but at least AMD has finally made some good decisions, especially since their Phenom II X6 chips cost $300 and a Core i7 six-core costs over $1000.

jerv:

@ApolloX64 After watching the Socket Wars on the Intel side, I stuck with my old Northwood P4 for longer than I should’ve and then went right to an LGA1156 Core i3. The way I see it, I don’t need top-tier stuff (like an LGA1366 i7), wouldn’t pay for it myself if I did, and I can slap an i7-875K in there for about $300 (MSRP on it is $342) if I need to upgrade right now; probably cheaper by the time I feel the absolute necessity a ways down the road.

AMD is finally becoming relevant again, but it seems to me that they were playing a long game, and as a result they lagged behind Intel for quite a while. Let’s see if they really can catch up. Too bad my roommate’s AM2 system is (barely) too old to take an X6 :(

Lightlyseared:

The reason Crysis doesn’t run that well even on brand-spanking-new PCs with heavily overclocked CPUs and graphics cards so fast and hot you could fry bacon on them is that it is badly written and badly optimised. It is also boring.

ApolloX64:

@jerv Damn right; we’ll have to wait and see how AMD plays their new hand in this one. They have a trump card in the form of ATI, thankfully, but time will tell.

@Lightlyseared Actually, Crysis is a rather elegant piece of programming, especially coming out of Germany. And as to your comment that “graphics cards are so fast and hot you could fry bacon on them”, that’s not entirely true. The average mid-range video card operates at about 40°C at idle and up to 70°C under heavy load. A high-end card will often operate at about 35–45°C at idle, and up to 90°C under 100% load. Rewind to 2003 and let’s introduce nVidia’s GeForce FX 5800 Ultra (oh yeah, that’s right, nVidia, I remember your little mistake), which had a whopping maximum operating temp of 145°C and needed a leaf-blower-esque fan to keep it under 120°C at the best of times. To date, it is the hottest, most expensive to produce, and most useless enthusiast-level video card ever made, right up there with nVidia’s 9800 GX2, the card that got so hot it often melted its PCBs or fused its own circuitry, resulting in one of the highest RMA rates ever for a video card that cost over $600. I speak from first-hand knowledge on these, btw.
In short, modern graphics cards are amazingly well designed, with a thermal ceiling they will rarely, if ever, reach. It’s usually not the hardware alone and not the software alone that is the issue; it’s the combination. A strong high-end CPU paired with high-latency cheap RAM on an expensive enthusiast motherboard and a low-budget video card will chop just as much as an out-of-date CPU paired with a cheap motherboard, enthusiast RAM, and a GTX 470. It’s all about the whole picture when it comes to PC gaming, which is why so many people have switched to consoles like the Xbox 360 and PS3, and I can’t say I blame them considering how complex it has become.

jerv:

@ApolloX64 “It’s all about the whole picture when it comes to PC gaming, which is why so many people have switched to consoles like the Xbox 360 and PS3, and I can’t say I blame them considering how complex it has become.”
The only real complexity I see is the new naming schemes, especially on video cards and Intel CPUs. To me, it seems like just common sense to match your shit. Part of my decision to go with the GT 240 had to do with my relatively low-end CPU and good-but-not-great mobo and RAM; I find people who cram a $600 card into a mismatched Scheißebox are just dumb. After installing this card, my components are well balanced overall, and that is what I was shooting for.

Then again, maybe I consider it simple common sense just because I had no problem learning this stuff back in grade school! The fact that modern-day consumers are confused by it is, I feel, more a reflection of the intellectual laziness running rampant in our society than of anything else.

And I thought my old Radeon 9800 XT was bad, with its 63°C idle and load temps that tripped the rather loud overtemp alarm on the motherboard after a few minutes of gaming regardless of how I arranged the case fans; but at least I never melted anything. I hit thermal shutdown (5°C above the alarm temp) a few times, though….

Ame_Evil:

Oh, I had a further question regarding graphics cards: can you run two monitors on the cards mentioned, or will I need to buy a further graphics card to run another monitor?

ApolloX64:

Yes, you can run two monitors on all the cards mentioned here. On ATI 5xxx series cards you can actually run three monitors using Eyefinity.
In fact, pretty much any card made post-2004 is capable of running dual monitors.

Lightlyseared:

@ApolloX64 Crysis is in no way, shape, or form an elegant piece of programming, and I don’t give a flying fuck where it’s from. It makes the same mistakes as Far Cry, it’s boring to play, and the suit functions obviously didn’t get play-tested at any point during development. Also, if you think modern graphics cards don’t get hot, then I DARE you to touch the heat pipe on a GTX 480 while it’s running Crysis (at 10 FPS, incidentally, compared to 45 FPS for Bad Company 2).

Actually, don’t touch the heat pipe, as I ended up with a second-degree burn.

ApolloX64:

@Lightlyseared Easy there; you’re welcome to your opinion, and I’m just stating my own, along with some facts. No reason to attack me for that. And I wasn’t saying that modern cards don’t get hot, just that they are far less prone to overheating issues than older cards. Yes, I have touched the heat pipe on a GTX 480 after it had been running a bench test for over two hours. It hurt about the same as touching the heat pipe on the old HD 2900 XT after it had been idling for two hours. Same goes for the 9800 GX2.
And just a concern: if you’re running Crysis at 10 FPS and BC2 at 45 FPS on a GTX 480, then you have some serious issues to work out with your system, or those drivers nVidia are putting out are buggier than I had previously thought.

Ame_Evil:

Hey guys, I decided to take your wise words into account and have narrowed it down to one of the following graphics cards.

Now, my unwise mind says go for the Sapphire 5770 because it has 2 GB, versus the GTS 250’s 1 GB. I read on Yahoo Answers that the 5770 is slightly faster than the GTS. Is it the better choice when they are the same price (£105 – $160)? Here are the links to the items for reference:

Sapphire ATI Radeon 5770 2 GB: http://www.dabs.com/products/sapphire-technology-ati-radeon-5570-hd-650mhz-2gb-pci-express-hdmi-6ZQ7.html

Nvidia GeForce GTS 250 1 GB: http://www.dinopc.com/shop/pc/nvidia-geforce-gts-250-1gb-82p526.htm?atc=1&idproduct=526

jerv:

@Ame_Evil I would go with the Radeon 5770. It’s not that it has more memory (though that helps), but rather that it is just plain superior to the GTS 250 even if they had the same amount. In fact, the 5770 is plain old decent at any price.
(The GTS 250 is just off the bottom of my screen when I tested that link. That should tell you something ;))

Ame_Evil:

Can I ask, out of pure curiosity, what the extra 1 GB of memory actually does? Is the difference between 1 GB and 2 GB noticeable, for example?

“(The GTS 250 is just off the bottom of my screen when I tested that link. That should tell you something ;))”

What do you mean by that?

jerv:

The simple way of putting it is that more memory lets the card hold more data: textures, vertex buffers, the Z-buffer, etcetera. In practical terms, though, there really isn’t much benefit to choosing a 2 GB card over a 1 GB card, all else being equal.

However, in this case, all else is not equal.

That link I posted was a list of GPUs with the fastest, most powerful ones at the top and the Radeon 5770 highlighted. Now, if you have a list with fast stuff on top and the GTS250 is far enough below the 5770 that I had to scroll down a hair to see it on my screen… get it now?
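To put a rough number on the memory question (illustrative arithmetic only, not a benchmark):

# Why 1 GB is usually plenty at these resolutions: the frame and depth
# buffers themselves are tiny compared to the card's total memory.
width, height = 1920, 1080
bytes_per_pixel = 4                      # 32-bit colour
buffers = 3                              # front, back, and depth buffers
framebuffer_mb = width * height * bytes_per_pixel * buffers / 2**20
print(f"Framebuffers: ~{framebuffer_mb:.0f} MB")   # ~24 MB
# Everything else (textures, geometry) shares the rest; games of this
# era rarely fill 1 GB, so doubling it to 2 GB seldom changes framerates.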

Ame_Evil:

@jerv Oh I must have missed the link somehow. Anyways thanks for all the advice. GAs all around!

Ame_Evil:

Oh crap that is a 5570, completely misread the numbers.

jerv:

BIG difference! In that case, the GTS 250 is a better bet even with only 1 GB; the Radeon 5570 is a couple of small notches below my GT 240. In fact, I found my card on Dabs for £89.14, so you can compare. It does well enough in the games I run (smooth framerates at 1600×900 with high detail) and is dead silent (which is why I went with it instead of a more powerful card), but if sheer power is more important to you, then get the GTS 250… though the link you posted has a sale price, so act fast ;)

Ame_Evil:

I found another card based on that link you gave me: the Radeon 5750. I found two versions of it, one selling at just under £94 and another at just under £105. I was wondering if it is a good choice for the price, and whether it would be better to get one over the other.

http://www.pixmania.co.uk/uk/uk/5998023/art/pixmania/radeon-hd-5750-1-gb-gddr5.html?srcid=867

http://www.amazon.co.uk/Sapphire-HD5750-GDDR5-Express-Graphics/dp/B002RXM5WE/ref=sr_1_1?ie=UTF8&s=electronics&qid=1279331173&sr=8-1

Thanks for all the help once again :) We are nearing the end, hopefully.

jerv:

Well, the fact that I have heard of Sapphire (they’ve been around for years) but never heard of Pixmania until now biases me a little bit, but I am currently more curious as to what you plan to do about the power supply.

jerv:

NNngggghhhh….

I would like to hear some reviews from people who have had one of them for not less than five years and would like to see more specs on it (especially a power/temperature graph) before I would trust it. Then again, I am a bit fussy.

There is also the more pressing problem that many high end video cards (including, if that pic is correct, the Sapphire HD 5750) require a 6-pin PCI-e connector. The PSU you are considering has zero of them. It also has only 2 SATA connectors, so you can forget about having multiple SATA hard drives if/when you decide to upgrade. I would look for something that would actually allow you to power that video card.
Also, considering that computer makers are moving from IDE drives to SATA drives, trading out 4-pin Molex connectors for SATA connectors is probably a good idea as well. My current motherboard can’t even take an IDE drive, so the only use I have for Molex connectors (of which I have three) is for case fans. My drives are SATA, and I could hook up six drives (enough for a four-drive RAID array and dual optical drives) without running out of either connectors or watts.

I have a 600W Cooler Master eXtreme Power Plus. Cooler Master has a solid reputation and a long track record. Sure, it cost me $75 on sale, but it’s also going to outlast the system I have, the system I replace it with, and possibly the one after that, so I consider a strong, reliable PSU to be an investment. Hell, the old Thermaltake 425W I had is still going strong after quite a few years. (I say “had” since my friend now has that computer.)

Of course, 600W is likely overkill for you; it’s actually about double what I need at the moment, but it was also on sale for less than the 400W and gives me room to expand, so I went for it. To find what size you should get, try this calculator, but don’t be afraid to go a bit higher like I did, especially if you plan to upgrade the thing over time.
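For a sense of the kind of sum such a calculator does, here is a minimal sketch (the wattage figures are rough placeholder estimates, not measured specs for any particular part):

# Add up rough per-component draws, then pad for headroom.
draws_w = {
    "cpu": 95,               # a typical mid-range quad-core TDP
    "gpu": 110,              # an HD 5770-class card under load
    "motherboard": 40,
    "ram_and_drives": 35,
    "fans_and_usb": 20,
}
load = sum(draws_w.values())        # 300 W in this example
recommended = load * 1.4            # ~40% headroom for ageing and upgrades
print(f"Estimated load: {load} W; look for a ~{recommended:.0f} W PSU")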

Ame_Evil:

Fine, how about this PSU? I dislike paying double, but meh.

http://www.pyramid.com/Antec_/Antec_BP500U_500_Watt_Power_Supply_Unit.aspx?utm_source=Google&utm_medium=Google%2BProducts&utm_campaign=Google%2BProducts

P.S. Do you have MSN? It may be quicker to talk about these things :D.

jerv:

An Antec will last you quite nicely. It’s what I used in my 3GHz AMD rig.

I don’t know what sort of power draw you are actually looking at (I didn’t add it up myself), but that site also has the Antec 430W for £34.77 and the Thermaltake 450W for £39.87, and you could probably do fine with those. Hell, in its current state, those would power my rig with ease, since I only really need ~280W.

However, I wonder a bit about that Pyramid place, and whether the UK doesn’t have someplace better and possibly cheaper. If you were here in my neck of the woods, I would just take you down to Winston’s place (where I got my last Thermaltake) or Re-PC and let you go nuts there. Hell, even Fry’s is both better and cheaper: that 430W Thermaltake is only £26.14 outright and £19.60 after rebate. Even with the 9.5% WA and King County sales tax, that is still cheaper, so unless VAT is more onerous than I remember, I think you can do better price-wise.

I do have MSN, but rarely the free time. Definitely not the time today, but maybe tomorrow. Of course, I am also 8 hours behind you and that may make things odd to schedule…

Ame_Evil:

Great – the graphics card is busto. I have sent a support ticket to Sapphire, but I’m not expecting anything except to return it. The screen keeps turning on and off, and eventually a “VPU recovery has reset your graphics accelerator” error comes up along with an error report. After this it offers to restart the computer, but then the same thing happens again.

Ame_Evil:

I have looked at other graphics cards, as I do not feel I want a replacement Sapphire one. I have found this http://www.overclockers.co.uk/showproduct.php?prodid=GX-124-XF&tool=3 HD 4890, which is apparently good, and better than the HD 5750 according to http://www.videocardbenchmark.net/video_lookup.php?cpu=Radeon+HD+5770 and other reviews, but only ~£20 more expensive. Should I get that instead if I have to replace it?

jerv:

Let me get back to you when I am on something with a full screen, a real keyboard, and multiple tabs. An iPod on wifi has its limitations.

Ame_Evil:

http://www.shrani.si/?u/Ap/TyHXxGC/31082009031.mp4"

That is pretty much what is happening to my PC/monitor at this moment in time. I have no idea what is causing it, and this guy is having the same problem (but got no replies whatsoever). I have trawled through many, many forums and have tried what they suggested, to no avail.

Do you think it could be a problem with the PSU or perhaps the mobo? I suspect it could be the PSU, as it is possibly 2–3 years old and from eBay ._. And I am suspecting the mobo in case we somehow damaged it when building the PC. I haven’t installed anything else on the computer besides the wireless adaptor and Mozilla.

Also, I can no longer reformat my hard drive and install Windows XP again, because as soon as it’s finished looking for all the files and is ready to set up, it gets a blue screen of death.

Starting to wish I never took up this project.

At the moment I am waiting on support from Sapphire, and for my friend to give me a free copy of Windows 7, to see if that fixes things.

jerv:

First off, I had to edit that link, since the quote mark at the end made my browser think it was something else and gave me a 404.

From what I saw, though, I am thinking more along the lines of an issue with the video drivers. That card is newer than XP, so XP won’t really be able to handle any Radeon made in the last few years unless/until you install the newest version of Catalyst. It also looks like the card was trying modes that the monitor doesn’t support; I had similar issues when I first got my LCD monitor. And unless you change a couple of things in the Control Panel, changing video modes in XP may require a restart every time you switch resolutions. Still, moving that window at the end of the clip was jerky, but I get that for a moment after my system starts up too, as well as when I have heavy tasks in the background; that is why I wait for everything to finish loading and my CPU load to drop before I try using it.

I also question the ability of WinXP to deal with other “new” hardware like SATA. Are you running the old, basic XP, or is it SP2 or SP3?

It’s also entirely possible that the problem is that copy of WinXP, but without knowing why it went into a BSOD, I cannot be sure. Memory errors due to bad DIMMs or mobo damage can cause a BSOD, but Windows seems to do that often enough on its own, even on good hardware, that I am going to play the odds and say it’s probably not hardware damage.

PSUs are rugged enough that I would rule yours out here. Basically, if it turns on without the smell of something burning, the PSU is good. Every PSU I’ve ever known either functions as designed (good for a quality PSU, bad for a Brand X) or doesn’t work at all, and most of the latter smell burnt.

As for the 4890, the only thing I have against it is that it is ~£20 more expensive. Then again, the speed difference is bigger (percentage-wise) than the price difference, so it is a better value. I say, if you can spare the £20 without starving, then go for it!
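To make that value point concrete, here is the quick arithmetic (the benchmark scores below are hypothetical placeholders, not real results for these cards):

# Price/performance check: if the speed gain (%) beats the price
# gain (%), the dearer card is better value per pound.
price_5750, price_4890 = 94.0, 114.0    # £, roughly the prices quoted above
score_5750, score_4890 = 100.0, 130.0   # hypothetical relative benchmark scores
price_gain = (price_4890 / price_5750 - 1) * 100
speed_gain = (score_4890 / score_5750 - 1) * 100
print(f"~{price_gain:.0f}% more money for ~{speed_gain:.0f}% more speed")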

Ame_Evil:

@jerv My windows are not jerky like his afterwards, and after the VPU error I can still change my screen resolution, up to the maximum the card supports (1920×1080).

The WinXP was a torrent, http://thepiratebay.org/torrent/5199210/WINDOWS_XP_PRO_32-BIT_SP3_ISO_ACTIVATED_GENUINE, just to make sure everything was working. It came with SP3 preinstalled, and I had installed the OS three times before the BSOD started appearing; it hasn’t gone away since.

I shall let you know the results of attempting to use this graphics card with Windows 7, or whether Sapphire contacts me.

The PSU doesn’t smell of smoke, and I highly doubt anything is overheating in the case that I bought. I just thought that the PSU might still appear to be working while not supplying enough power to the GPU, thus causing the error.

At the moment my fear is that I cannot install Windows 7, or that if I can, the problems will carry over. And my ultimate fear is that if I get a new GPU, the problems will persist :x

jerv:

Have you got the latest version of Catalyst, though? My Radeon 9800 XT got a pretty good boost in performance and compatibility after I updated. Also, my old Northwood P4 and that 9800 XT drew more power than your rig and had a smaller (400W) PSU, so I doubt a lack of juice is your problem.

I think Win7 will help a bit too.

Ame_Evil:

Yeah, I downloaded the latest one from the website – version 10.7. I also tried 10.6 and 10.5, as well as the one that came with the CD (no idea which version).
