Why is the RandomX algorithm being hyped to the moon?
TL;DR: don't assume the average return from mining RandomX will be higher than the current CryptonightR algorithm. Hold back your excitement for now.

I think we all need to bring something to our attention. Over the last month, there have been so many topics and comments here on MoneroMining about the new 'RandomX' algorithm, which is supposed to launch a couple of months from now. There are many questions like "Is this a good hashrate for my CPU?", "What's your power usage on RandomX?", "How can I tune my CPU for RandomX?", "How would the algorithm perform on this hardware?" I think these are great, constructive comments that are at the heart of what miners stand for. We miners love optimizing our rigs and educating ourselves on technological trends.

But I've noticed many questions such as "What parts should I buy for a RandomX mining rig?", "Is an AMD Ryzen 9 3900X a good investment?", "What parts will give me the most profit when RandomX launches?" Many of these questions are asked with very little research. I think there's a gold fever brewing behind some of these comments. The kind of motives that have bankrupted many miners in past bubbles.

As we have seen in 2014 and 2018, anybody who enters the crypto industry with an 'I want easy profit' attitude almost always goes bankrupt. They buy coins or hardware at the peak of the bubble. Sometimes they get lucky and sell their coins or rigs right before the crash (only to get burned in a future bubble later), but most of the time these new users lose most of their investment. As a veteran miner, a lot of alarm bells ring in my head when I read these kinds of RandomX hype posts. I have no reason to think CPU mining will be more profitable on RandomX than on the current CryptonightR.
If the new AMD CPUs are very efficient on RandomX, that just means more people will buy them, driving up the difficulty. Your shiny R9 3900x's profit will start falling because it's no longer as competitive against the other hardware on the network.
If the profits on day 1 of the RandomX launch are indeed high, more people will start adding rigs to the network. If the average miner's profit is above the equilibrium of the market, it will start going down. That equilibrium is largely set by botnets, large scale farms in China/Russia/Niagara Falls/Georgia, and datacenters with spare capacity. So if your R9 3900x earns $10/day on day 1, you can count on that golden streak ending soon.
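The dilution argument above is easy to sketch: your revenue is just your share of the network hashrate, so every rig that joins cuts everyone's pay. All numbers below are made up for illustration, not real network data.

```python
# Hypothetical sketch: mining revenue as a share of network hashrate.
# Figures are invented for illustration only.

def daily_revenue(my_hashrate, net_hashrate, blocks_per_day, block_reward, coin_price):
    """Expected coins per day times price, given your hashrate share."""
    share = my_hashrate / net_hashrate
    return share * blocks_per_day * block_reward * coin_price

# Day 1: a 10 kH/s rig on a 1 GH/s network (made-up figures).
day1 = daily_revenue(10_000, 1_000_000_000, 720, 2.0, 70.0)

# A while later: same rig, same coin price, but the network has doubled.
later = daily_revenue(10_000, 2_000_000_000, 720, 2.0, 70.0)

print(round(day1, 2), round(later, 2))
```

Nothing about your rig changed between the two lines; the network doubling alone halves your take, which is exactly why day-1 profits can't be extrapolated.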
CPU mining as a market is never stable, and your CPU rig is limited to just 1 or 2 coins: Monero and Veruscoin. Edit: there are a few more CPU coins than these. AMD GPUs can at least mine 3 or 4 coins well, while nVidia GPUs are the best at 5-10 different algorithms. That makes GPU mining the safer, less risky investment: GPU mining is like playing blackjack, while building a rig specifically for CPU mining is like tossing a coin. You're locked into one coin by building a CPU rig. Yes, it has resale value to gamers, but it's much harder to resell a CPU/MOBO combo than a bunch of GPUs at any price. Trust me, I've sold hundreds of GPUs and dozens of MOBOs before!
I don't know what the market share of CPUs vs. GPUs on CryptonightR is right now. But if most of the current nethash is made up of CPUs, these CPUs will have no choice but to switch to RandomX when it is out. There's no other coin for them to mine, unless they have some work to do outside of mining. So almost all of them will get onto the RandomX network, too, along with your expensive new CPU rig. I think this'll be the biggest factor driving up difficulty. Yes, the older CPUs might not be as efficient as the new Ryzens, but many of them are already paid for in terms of capital (like in a datacenter) or have free power (like in a botnet or apartment with free power).
You might say that Monero will always be profitable enough because it has survived so long, or the developers are better, or they're taking action against ASICs. But that doesn't necessarily guarantee profit. Monero might be a successful coin and overtake ETH, but that has nothing to do with profit on the network. Even though Bitcoin's really successful, you're guaranteed to lose money if you buy the latest Antminer and run it at residential power rates. Meanwhile, Dogecoin back in the day had awesome profits even though it was a blatant fork of LTC with few improvements.
Your new RandomX rig might look like it has a decent "ROI" to you, but that doesn't mean it was the best investment. You might have been better off building a GPU rig and mining Grincoin or Ravencoin. In other words, if you build a RandomX rig, you may be earning less profit for the same amount of capital invested. And even if you earn the same return, you still took on a higher risk than if you had built a GPU rig (see the point above).
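The opportunity-cost point above can be put in numbers: "it ROI'd" isn't the right test, comparing the same capital across builds is. All figures here are invented for illustration.

```python
# Hedged sketch of opportunity cost: compare the same capital in two
# hypothetical builds. Every number below is made up for illustration.

def simple_roi(capex, daily_profit, days):
    """Total profit over `days` as a fraction of the capital invested."""
    return daily_profit * days / capex

cpu_rig = simple_roi(capex=1500, daily_profit=3.0, days=730)  # hypothetical RandomX build
gpu_rig = simple_roi(capex=1500, daily_profit=4.0, days=730)  # hypothetical GPU build

# Both rigs "ROI" over two years, but the CPU rig still cost you the
# difference in returns, before even pricing in its single-coin risk.
print(cpu_rig, gpu_rig)
```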
In the GPU mining community, I have the feeling that there's a lot of resentment over the 2018 crypto recession and the whole 'ASIC miner invasion'. I think people here are feeling burned over their losses last year and the evil ASIC takeover, and want an opportunity for the little guy to start mining again. So we're falsely seeing the RandomX ray of hope as a floodlight, and getting overexcited.

And in general, the ordinary person cannot make a significant, steady profit in the crypto mining industry. The guy who wrote that thread is very rich, and even 100 GTX 1080 Tis cost nothing to him. The reason he became wealthy is because he avoided get-rich-quick gimmicks back in the day (like the dotcom sites) and focused on learning technology for the future. Mining will not make you rich, and especially not RandomX coin tossing.

If you love RandomX, build your rig now, keep benchmarking and undervolting, and have fun at it. But if you just want profit, wait until RandomX is up and running, and consider all the risks involved with a new algorithm and commercial mining in general. So I hope we can all reconsider whether we're excited about RandomX for the right reasons. Let's try to avoid jumping to conclusions about profitability and hold off on the Newegg 'checkout' button. Even though 12 cores at 70 watts sounds awesome. Happy mining!
I literally have tens of thousands of dollars in top-shelf hardware, looking to repurpose some before selling on eBay to build a NAS system, possibly a dedicated firewall device as well. o_O
**Q1) What will you be doing with this PC? Be as specific as possible, and include specific games or programs you will be using.**

A1) This will be a dedicated NAS system for my home network. As such, I'm looking to have it:

- Host ##TB's of 720, 1080 & up resolution Movies and TV Shows I'm about to begin ripping from a MASSIVE DVD & Blu-ray collection I have.
- My kids are big on Minecraft. I understand it's possible to host your own "worlds" (or whatever they call the maps you can build) on your own "server". I think it would be pretty neat to offer them (& their friends, if it can be done 'safely/securely') their own partition on one of my NAS HDDs.
- I also have accounts with a couple diff VPN companies... I understand it's possible (?) to sync said VPNs with a NAS; this might be a more relevant topic on the next point/purpose...
- I'd like to be able to remotely link to this NAS for when I travel overseas and want to stream at my temp location from my house/this NAS.

______________________

**Q2) What is your maximum budget before rebates/shipping/taxes?**

A2) Here's where I make matters more complicated than most others would... I've been an advocate for Bitcoin and crypto-currencies in general since 2013. I invested in a small mining outfit back in 2014 (strictly Bitcoin/ASICs). One of my buddies is the President of a large-scale mining operation (foreign and domestic) and he convinced me to dabble in the GPU mining space. I made my first hardware purchase in Q4 2017 and have run a small-scale GPU farm in my house since then. I had the rigs mining up until Q3 of 2018 (not cost-efficient to keep on, especially living in SoFlo) and since then, the hardware's been collecting dust (& pissing off my family members since they lost access to 3X rooms in the house - I won't let anyone go near my gear). One of my New Year's Resolutions for 2019 was to clear out the house of all my mining equipment, so that's all about to go up on eBay.
So "budget" is relative to whatever I "MUST" spend if I can't repurpose any of the parts I already have on hand for this build... (Anyone having something I "need" and looking to barter for one of the items I'll list later on in here, LMK.)

______________________

**Q3) When do you plan on building/buying the PC? Note: beyond a week or two from today means any build you receive will be out of date when you want to buy.**

A3) IMMEDIATELY! :)

______________________

**Q4) What, exactly, do you need included in the budget? (Tower/OS/monitor/keyboard/mouse/etc.)**

A4) Well, I had a half-assed idea approximately 1 year ago that it might be wise to build a bunch of 'gaming rigs' to sell on eBay with my intended repurposed mining hardware, so I went on a shopping spree for like 6 months. That said, I've got a plethora of various other components that aren't even unboxed yet. 90% of the items I've purchased for this additional project were items that were marked down via MIR (mail-in rebates) & what-not...
AFAIK, there are only 3X items I absolutely do not have which I 'MUST' find. Those would be:

1) Motherboard which accepts "ECC RAM".
2) CPU for said MOBO.
3) Said "ECC RAM".
______________________

**Q5) Which country (and state/province) will you be purchasing the parts in? If you're in US, do you have access to a Microcenter location?**

A5) I'm located in Southwest Florida. No Microcenters here. Best Buy is pretty much my only option, although I am a member of Newegg, Amazon & Costco if that makes any difference?

______________________

**Q6) If reusing any parts (including monitor(s)/keyboard/mouse/etc), what parts will you be reusing? Brands and models are appreciated.**

A6) In an attempt to better clean up this Q&A, I'm going to list the items I have on-hand at the end of this questionnaire, in case passers-by feel like this might be a TL;DR. (Scroll to the bottom & you'll see what I mean.)

______________________

**Q7) Will you be overclocking? If yes, are you interested in overclocking right away, or down the line? CPU and/or GPU?**

A7) I don't think that's necessary for my intended purpose, although I'm not against it if that helps &, FWIW, I'm pretty skilled @ this task already (it's not rocket science).

______________________

**Q8) Are there any specific features or items you want/need in the build? (ex: SSD, large amount of storage or a RAID setup, CUDA or OpenCL support, etc)**

A8) As stated in A4, ECC RAM is non-negotiable... RAID seems like a logical application here as well.

- This will predominantly be receiving commands from MacOS computers. I don't think that matters really, but figured it couldn't hurt to let you guys know.
- I'd also be quite fond of implementing "PFSENSE" (or something of that caliber) on this system so I could give my Netgear Nighthawks less stress in that arena; plus, my limited understanding of PFSENSE is that its ability to act as a firewall runs circles around anything that comes with consumer-grade Wi-Fi routers (like my Nighthawks).
Just the same, I'm open to building a second rig just for the firewall.

- Another desirable feature would be that it draws as little electricity from the wall as possible. (I'm EXTREMELY skilled in this arena. I have "Kill-A-Watts" to test/gauge on, as well as an intimate understanding of the differences between Silver, Gold, Platinum and Titanium rated PSUs. I've also already measured each of the PSUs I have on-hand and taken note of the 'target TDP draw' ("Peak Power Efficiency Draw") each one offers when primed with X amount of GPUs, from when I used them for their original purpose.)
- Last, but not least, sound (as in noise created by the rig). I'd like to prop this device up on my entertainment center in the living room. I've (almost) all of the top-shelf consumer-grade products one could dream of regarding fans and other thermal-related artifacts.
- Almost forgot; this will be hosting to devices on the KODI platform (unless you guys have better alternative suggestions?)

______________________

**Q9) Do you have any specific case preferences (size like ITX/microATX/mid-tower/full-tower, styles, colors, window or not, LED lighting, etc), or a particular color theme preference for the components?**

A9) Definitely! Desired theme would be WHITE. If that doesn't work for whatever reason, black or gray would suffice. Regarding "Case Size" - nah, that's not too important, although I don't foresee a mini-ITX build making sense if I'm going to be cramming double-digit amounts of TB in the system. Internal HDDs sound better than a bunch of externals plugged into all the USB ports.

______________________

**Q10) Do you need a copy of Windows included in the budget? If you do need one included, do you have a preference?**

A10) I don't know. If I do need a copy of Windows, I don't have one, so that's something I'll have to consider I guess. I doubt that's a necessity though.
______________________

**Extra info or particulars:**

AND NOW TO THE FUN-STUFF... Here's a list of everything (PARTS PARTS PARTS) I have on-hand and ready to deploy into the wild &/or negotiate a trade/barter with:

CASES -

- Corsair Carbide Series Air 540 Arctic White (Model# CC-9011048-WW) - (Probably my top pick for this build).
- Cooler Master HAF XB EVO (This is probably my 1st or 2nd pick for this build, the thing is a monster!).
- Cooler Master Elite 130 - Mini ITX - Black
- Cooler Master MasterBox 5 Mid-Tower - Black & White
- Raidmax Sigma-TWS - ATX - White
- MasterBox Lite 5 - ATX - Black w/ diff. colored accent attachments (included with purchase)
- NZXT S340 Elite Matte White Steel/Tempered Glass Edition
- EVGA DG-76 Alpine White - Mid Tower w/ window
- EVGA DG-73 Black - Mid Tower w/ window (I have like 3 of these)

______________________

CPUs -

***7TH GEN OR BELOW INTELs (code name class mentioned next to each one)***

- Pentium G4400 (Skylake @ 54W TDP) - Intel ARK states "ECC CAPABLE"
- Celeron G3930 (Kaby Lake @ 51W TDP) - Intel ARK states "ECC CAPABLE" :)
- i5 6402P (Skylake @ 65W TDP) - Intel ARK states "NOT ECC CAPABLE" :(
- i5 6600K (Skylake @ 91W TDP) - Intel ARK states "NOT ECC CAPABLE" :(
- i7 6700 (Skylake @ 65W TDP) - Intel ARK states "NOT ECC CAPABLE" :(
- i7 7700K (Kaby Lake @ 95W TDP) - Intel ARK states "NOT ECC CAPABLE" :(

***8TH GEN INTELs***

- i3-8350K (Coffee Lake @ 91W TDP) - Intel ARK states "ECC FRIENDLY" :)
- i5-8600K (Coffee Lake @ 95W TDP) - Intel ARK states "NOT ECC CAPABLE" :(

***AMD RYZENs***

- Ryzen 3 2200G
- Ryzen 5 1600
- Ryzen 7 1700X

______________________

MOTHERBOARDS -

***7TH GEN AND BELOW INTEL-BASED MOBOs***

- MSI Z170A-SLI
- ASUS PRIME Z270-A
- ASUS PRIME Z270-P
- ASUS PRIME Z270-K
- EVGA Z270 Stinger
- GIGABYTE GA-Z270XP-SLI
- MSI B150M ARCTIC
- MSI B250M MICRO ATX (PRO OPT. BOOST EDITION)

***8TH GEN INTEL-BASED MOBOs***

- EVGA Z370 FTW
- GIGABYTE Z370XP SLI (Rev. 1.0)
- MSI Z370 SLI PLUS

***AMD RYZEN-BASED MOBOs***

- ASUS ROG STRIX B350-F GAMING
- MSI B350 TOMAHAWK
- MSI X370 GAMING PRO
- ASROCK AB350M PRO4

______________________

RAM -

Way too many to list; nothing but 4 & 8GB DDR4 sticks and unfortunately, none are ECC, so it's not even worth mentioning/listing these unless someone reading this is willing to barter. At which time I'd be obliged to send an itemized list or see if I have what they're/you're specifically looking for.

______________________

THERMAL APPLICATIONS/FANS -

JUST FANS -

- BeQuiet - Pure Wings 2 (80mm), Pure Wings 2 (120mm), Pure Wings 2 (140mm), Silent Wings 3 PWM (120mm)
- NOCTUA - PoopBrown NF-A20 PWM (200mm), specifically for the BIG "CoolerMaster HAF XB EVO" case
- NOCTUA - Grey NF-P12 Redux - 1700RPM (120mm) PWM
- Corsair - Air Series AF120 LED (120mm)

CPU COOLING SYSTEMS -

- NOCTUA - NT-HH 1.4ml Thermal Compound
- NOCTUA - NH-D15 6-heatpipe system (this thing is the tits)
- EVGA - CLC 240 (240mm water-cooled system). (Extremely crappy coding in the software here; I'm like 99.99% sure these will be problematic if I were to try and use them in any OS outside of Windows, because they barely ever work in the intended Windows as it is.)
- CRYORIG - Cryorig C7 Cu (Low-Profile Copper Edition)
- A few other oversized CPU cooling systems I forget off the top of my head, but a CPU cooler is a CPU cooler after comparing to the previous 3 models I mentioned.

I almost exclusively am using these amazing "Innovation Cooling Graphite Thermal Pads" as an alternative to thermal paste for my CPUs. They're not cheap but they literally last forever.
- NZXT - Sentry Mesh Fan Controller

______________________

POWER SUPPLIES (PSUs) -

- BeQuiet - 550W Straight Power 11 (GOLD)
- EVGA - 750 P2 (750W, Platinum), 850 P2 (850W, Platinum), 750 T2 (750W, TITANIUM - yeah baby, yeah)
- ROSEWILL - Quark 750W Platinum, Quark 650W Platinum
- SEASONIC - Focus 750W Platinum

______________________

STORAGE -

- HGST Ultrastar 3TB - 64MB cache - 7200RPM SATA III (3.5")
- 4X Samsung 860 EVO 500GB SSDs
- 2X Team Group L5 LITE 3D 2.5" SSDs, 480GB
- 2X WD 10TB Essential EXT (I'm cool with shucking)
- + 6X various other external HDDs (from 4-8TB) - (Seagate, WD & G-Drives)

______________________

Other accessories worth mentioning -

- PCI-E to 4X USB hub-adapter (I have a dozen or so of these - might not be sufficient &/or needed, but again, 'worth mentioning' in case I somehow ever run out of SATA & USB ports and have extra external USB HDDs. Although I'm sure there would be better-suited components if I get to that point that probably won't cost all that much.)

______________________

Needless to say, I have at least 1X of everything mentioned above. In most all cases, I have multiples of these items, but obviously won't be needing 2X CPUs, cases, etc... Naturally, I have GPUs. Specifically: at least 1X of every. single. NVIDIA GTX 1070 Ti. (Yes, I have every variation of the 1070 Ti made by MSI, EVGA and Zotac. The only brand I don't have is the Gigabyte line. My partners have terrible experience with those so I didn't even bother.) I'm clearly not going to be needing a GPU for this build, but again, I'm cool with discussing the idea of a barter if anyone reading this is in the market for one. I also have some GTX 1080 Tis but those are already spoken for, sorry.
It's my understanding that select CPUs I have on this list are ECC-friendly, and AFAIK, only 1 of my MOBOs claims to be ECC-friendly (the ASROCK AB350M PRO4), but for the life of me, I can't find any corresponding forums that confirm this and/or direct me to a listing where I can buy compatible RAM. Just the same, if I go w/ the ASROCK MOBO, that means I'd be using one of the Ryzens. Those are DEF. power-hungry little buggers. Not a deal-breaker, just hoping to find something a little more conservative in terms of TDP.

In closing, I don't really need someone to hold my hand with the build part as much as figuring out which motherboard, CPU and RAM to get. Then I'm DEFINITELY going to need some guidance on what OS is best for my desired purpose. If building 2X rigs makes sense, I'm totally open to that as well...

Rig 1 = EPIC NAS SYSTEM
Rig 2 = EPIC PFSENSE (or the like) DEDICATED FIREWALL

Oh, I almost forgot... The current routers I'm using are...

- 1X Netgear Nighthawk 6900P (Modem + Router)
- 1X Netgear Nighthawk X6S (AC4000 I believe - router dedicated to my personal devices - no IoT &/or guests allowed on this one)
- 1X TP-Link Archer C5 (Router). Total overkill after implementing the Nighthawks, but this old beast somehow has the best range, plus it has 2X USB ports, so for now it's dedicated to my IoT devices.

I also have a few other Wi-Fi routers (Apple AirPort Extreme & some inferior Netgears, but I can only allocate so many Wi-Fi routers to so many Wi-Fi channels w/out pissing off my neighbors). On that note, I have managed to convince my neighbors to let me in on their house/Wi-Fi configuration, so we all have our hardware locked on specific, non-competing frequencies/channels and everyone's happy. :)

Please spare me the insults, as I insulted myself throughout this entire venture. Part of why I did this was because when I was a kid, I used to fantasize about building a 'DREAM PC' but could never afford such.
To compensate for this deficiency, I would actually type out the latest and greatest hardware components in a Word document, print the lists up & tape them to the wall (for motivation). I was C++ certified at the age of 14 and built my first PC when I was 7. At the age of 15 I abandoned all hope in the sector and moved on to other aspirations. This entire ordeal was largely based on me finally fulfilling a childhood fantasy. On that note = mission accomplished. Now if I'm actually able to fulfill my desires on this post, I'm definitely going to feel less shitty about blowing so much money on all this stuff over the last couple years. TIA for assisting in any way possible. Gotta love the internets! THE END. :)

EDIT/UPDATE (5 hours after OP) - My inbox is being inundated with various people asking for prices and other reasonable questions about my hardware being up for sale. Not to be redundant, but rather to expound on my previous remarks about being interested in a barter/trade with any of you here... I did say I was going to sell my gear on eBay in the near future; I also said I wanted to trade/barter for anything relative to helping me accomplish my OP's mission(s). I'm not desperate for the $$$, but I'm also not one of those people that likes to rip other people off. That said, I value my time and money invested in this hardware, and I'm only willing to unload it all once I've established I have ZERO need for any of it here in my home first. Hence my writing this lengthy thread in an attempt to repurpose at least a grand or two I've already spent. One of the most commonly asked questions I anticipate receiving from interested parties is going to be: "How hard were you on your hardware?"
Contrary to what anyone else would probably have done in my scenario (say they were light on it whether they were or weren't), I documented my handling of the hardware, and have no problem sharing such documentation with verified, interested buyers (WHEN THE TIME COMES) to offer you guys peace of mind. I have photos and videos of the venture from A-Z. I am also willing to provide (redacted) electricity bill statements where you can correlate my photos (power draw on each rig), and also accurately deduct the excess power my house consumed with our other household appliances. Even taking into consideration how much (more) I spent in electricity from keeping my house at a constant, cool 70-72F year-round (via my Nest thermostat). Even without the rigs, I keep my AC @ 70 when I'm home, and for the last 1.5-2 years I just so happened to spend 85% of my time here at my house. When I would travel, I'd keep it at 72 for my wife & kids. Additionally, I had each GPU 'custom' over/underclocked (MSI Afterburner for all GPUs but the EVGAs).

I doubt everyone reading this is aware, so this is for those that don't know... EVGA had the brilliant idea of implementing what they call "ICX technology" in their latest NVIDIA GTX GPUs. The short(est) explanation of this "feature" goes as follows: EVGA GPUs w/ "ICX 9 & above" have EXTRA HEAT/THERMAL SENSORS. Unlike every other GTX 1070 Ti on the market, the ones with this feature actually have each of the 2/2 on-board fans connected to individual thermal sensors. Which means: if you were to use the MSI Afterburner program on one of these EVGAs and create a custom fan curve for it, you'd only be able to get 1/2 of the fans to function the way intended. The other fan simply would not engage, as the MSI Afterburner software wasn't designed/coded to recognize/communicate with an added sensor (let alone sensors).
This, in turn, would likely result in whoever's using it the unintended way having a GPU defect on them within the first few months, I'd imagine... Perhaps if they had the TDP power settings dumbed down as much as I did (60-63%), they might get a year or two out of it since it wouldn't run nearly as hot, but I doubt any longer than that, since cutting off 50% of the cooling system on one of these can't be ignored too long; surely capacitors would start to blow and who knows what else...

(Warning = RANT) Another interesting side-note about the EVGAs and their "Precision X" over/underclocking software is that it's designed to only recognize 4X GPUs on a single system. For miners, that's just not cool. My favorite builds had 8X, and for the motherboards that weren't capable of maintaining stable sessions on 8, I set up with 6X. Only my EVGA rigs had 3 or 4X GPUs dedicated to a single motherboard. Furthermore, and as stated in an earlier paragraph (& this is just my opinion) = EVGA SOFTWARE SUCKS! Precision X wasn't friendly with every motherboard/CPU I threw at it, and their extension software for the CLC Closed-Loop-Cooling CPU water-coolers simply didn't work on anything, not even integrating into their own Precision X software. The amount of time it took me to finally find compatible matches with that stuff was beyond maddening. (END RANT)

Which leads me to my other comments on the matter. That's what I had every single 1070 Ti set at for TDP = 60-63%. Dropping the power load that much allowed me to bring down (on average) each 1070 Ti to a constant 110-115W (mind you, this is only possible w/ "Titanium" rated PSUs; Platinum comes pretty damn close to the Titanium though) while mining Ethereum, and I was still able to maintain a bottom of 30 MH/s and a ceiling of 32 MH/s. Increasing the TDP to 80, 90, 100% or more only increased my hashrates (yields) negligibly, like 35-36 MH/s TOPS, which also meant each one was not only pulling 160-180W+ (vs.
the aforementioned 115'ish range), it also meant my rigs were creating a significantly greater amount of heat! Fortunately for the GPUs and my own personal habits, I live in South Florida where it's typically hot as balls (last winter was nothing like this one). Increasing my yields by 10-15% didn't justify increasing the heat production in my house by >30%, nor the added electricity costs from subjecting my AC handlers to that much of an extra workload. For anyone reading this that doesn't know/understand what I'm talking about: I spent no less than 2-3 hours with each. and. every. one. I didn't play with the settings on just one and universally apply the settings to the rest. I found the 'prime' settings and documented them with a label-maker and notepad.

Here's the math in a more transparent manner:

- I NEVER LET MY GPUs BREACH 61C, EVER. Only my 8X GPU rigs saw 60-61, & it was the ones I had in the center of the build (naturally). I have REALLY high-power fans (used on BTC ASIC MINERS) that were sucking air from those GPUs, which was the only way I was able to obtain such stellar results while mining with them.
- Mining at "acceptable" heat temps (not acceptable to me, but most of the internet would disagree = 70C) and overclocking accordingly brings in X amount of yields per unit.
- 'Tweaking' (underclocking) the GPUs to my parameters reduced my yield per unit by 10-15%, but it SAVED me well over 30-35% in direct electricity consumption, and an unknown amount of passive electricity consumption via creating approximately 20%+ less heat for my AC handler to combat.

I say all this extra stuff not just for anyone interested in mining with their GPUs, but really to answer (in-depth) the apparent questions you people are asking me in PMs. Something else that should help justify my claims of being so conservative is the fact I only have/used "Platinum and Titanium" rated PSUs.
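The undervolting trade-off described above can be restated as hashrate per watt, using the midpoints of the ranges the post quotes (30-32 MH/s at 110-115 W undervolted vs. 35-36 MH/s at 160-180 W at stock TDP):

```python
# Rough efficiency math from the figures quoted above (midpoints only;
# real cards vary unit to unit).

def efficiency(mh_per_s, watts):
    """Hashrate per watt of wall draw."""
    return mh_per_s / watts

undervolted = efficiency(31.0, 113.0)  # ~60-63% TDP setting
stock = efficiency(35.5, 170.0)        # 100% TDP setting

hash_gain = 35.5 / 31.0 - 1    # ~15% more hashrate at stock...
power_gain = 170.0 / 113.0 - 1 # ...for ~50% more wall draw

print(f"{undervolted:.3f} vs {stock:.3f} MH/s per watt")
```

In per-watt terms the undervolted card is clearly ahead, which is the whole argument: a ~15% yield gain is not worth a ~50% power (and heat) increase.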
Heat production, power efficiency and longevity of the hardware were ALWAYS my top priorities. I truly thought crypto would continue to gain and/or recover and bounce back faster than it did. If this project had maintained positive income for 12+ months, I'd have expanded one of our sites to also cater to GPU mining on a gnarly scale. Once I have my NAS (& possibly a 2nd rig for the firewall) successfully built, I'll be willing/able to entertain selling you guys some/all of the remaining hardware prior to launching on eBay. If there's something you're specifically looking for that I listed having, feel free to PM me with that/those specific item(s). Don't count on an immediate response, but what you can count on is me honoring my word in offering whoever asks first right of refusal when the time comes for me to sell this stuff. Fortunately for me, PMs are time-stamped, so that's how I'll gauge everyone's place in line. I hope this extra edit answers most of the questions you guys wanted answered; if not, sorry I guess. I'll do my best to bring light to anything I've missed after I realize whatever that error was/is. The only way anyone is getting first dibs on my hardware otherwise is if they either offer compelling insight into my original questions, or have something I need to trade with. THE END (Round #2)
I haven't looked at video cards since the GTX 1070 was launched. Help me catch up?
I preordered the GTX 1070 straight from Nvidia after the keynote, right before the whole bitcoin boom for graphics cards. I'm thinking about upgrading. Help me catch up with the current times? Nvidia vs AMD, GTX vs ???, how long have the latest cards been out, how long until an update? I know I can research this myself and I'm currently poking around on Newegg; just looking for some input or maybe an ELI5 response?
Intersecting and competing interests of miners vs. investors
This post is pure speculation but it's something that I've been thinking about for a while. This post is informational - it's not a quick FUD/FOMO analysis. However, I do make a case for being a long-term bull (i.e. years). There are two major groups with large individual resources: miners and crypto investors. These aren't your general traders; these are large, multi-million dollar groups (or larger). Let's look at the motivations of both to see how they can relate to prices.

Crypto Miners

Miners obviously want maximum profit. There are several ways to do this:
cut costs by buying cheaper hardware. Due to the crypto market bonanza in 2017, prices for various rigs have skyrocketed, even ASICs.
increase price of crypto. If you can't cut costs, increase price of crypto through market manipulation (basically market buys which wipes out order books).
Note that Bitcoin's difficulty is at an all-time high. Litecoin's too. Increased difficulty means the same equipment will take longer to generate the same reward. Also note that with the upcoming halving - coming in a month for Litecoin and next year for Bitcoin - the reward for each crypto will significantly decrease. This means that, all else being equal, the profit for miners will drop significantly (temporarily, at least).

The other news is that your typical miner isn't making a lot of money. Like many other examples, economies of scale come into play, and your big investors that have large facilities and equipment are the ones making more money. This means more power in the hands of fewer people who have a larger investment with their various interests. How is an individual going to compete with something like this?

Also note that when the crypto market fell at the end of 2017, miner manufacturers had losses due to lack of new buyers. This led to a collapse in prices for various ASIC equipment and related hardware. This does affect stock market prices. Although crypto hardware isn't exactly a huge profit center, check out stock prices for AMD, Intel, and NVidia for the last 5 years. You'll see articles like this and this that support my conclusions. Someone could dig more into this to get better numbers.

Crypto Investors

Crypto investors (the whales) don't really care as much about buying vs. selling - they can profit from either move in the price. However, shorting is risky, and shorting crypto is very risky, so more are likely to err on the side of growth. It also benefits them to have any large swing in prices as opposed to steady growth. They want the market to continue to grow, since if it shrinks, it can be destroyed and their profits will go away. They also don't want the market to get too large too fast, but some things are beyond their control once they overheat.
They're frustrated: they want to pump a lot of money into this - for massive profits - but that much attention would be noticed. For instance, if some whale invested $50b into Bitcoin, it would cause havoc in the market and the prices, so they have to keep their investments relatively small. The big institutions want to throw more money in, but they know that if they do, the market will get out of hand. Being noticed invites unwanted regulation, which leads to loss of control and, likely, lower prices with less opportunity.

Note that the interests of miners and investors sometimes overlap. For instance, miners want the crypto price to be higher so they have higher profits, and investors also benefit from higher prices. Sometimes, though, their interests conflict. If I were running a mining business, here's what I'd see: rising costs due to higher ASIC prices, lower rewards due to higher difficulty, and lower rewards due to halving. What's my solution? I would:
try to manipulate the market to raise the price of crypto to make mining more profitable
from time to time, try to crash the market to make equipment prices collapse so I can stock up on new equipment and then raise the market again
note that prices of various ASICs have fallen by over 2/3 and are coming back up again. If I can get a 66%-off sale on replacement equipment, I'd take it: use profits from shorting crypto to buy the new gear, then wait for the market to spike again (and help it along)
You can see how investors could work toward this, and how some miners could pool money to hire professional traders to do it. Same with companies like AMD, Intel, Nvidia, and others (e.g. Samsung) who stand to make a lot of money selling this equipment. The simple problem with crypto is that for it to succeed:
market needs to continue to grow
as a result, more halving events will continue to happen (mathematical certainty)
meaning rewards will continue to decrease
difficulty will continue to increase
so if prices stagnate, miners will be out of business
The only solution is for the miners - and their suppliers - to keep pumping crypto prices higher to maximize their profits... indefinitely. Investors help raise prices, but they also help when the market overheats and they cash out and/or short. A market crash temporarily helps miners, who can now buy cheaper equipment. We've all seen charts like these. How else can you explain such projections (given past history)? You do it with the continued - almost mathematically calculated - rises and falls in prices over time. If you overlay difficulty, ASIC prices, and miner profitability, I'm sure you'll see a pattern. Higher difficulty (i.e. more costs) and higher hardware prices require higher crypto prices for miners to stay in business. And since the market is still relatively small, it's easier to manipulate prices higher.
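The squeeze described above - rewards halving while difficulty rises - can be put into a toy break-even calculation. Every number below is a made-up assumption for illustration, not a real network figure:

```python
# Toy break-even model: at what coin price does mining revenue
# cover electricity? All inputs are invented for illustration.

def breakeven_price(rig_power_kw, electricity_per_kwh, coins_per_day):
    """Coin price at which daily electricity cost equals daily mining revenue."""
    daily_cost = rig_power_kw * 24 * electricity_per_kwh
    return daily_cost / coins_per_day

# A 1 kW rig at $0.10/kWh mining 0.01 coins/day needs the coin
# at $240 just to break even on power.
before = breakeven_price(1.0, 0.10, 0.01)

# Difficulty doubles AND the reward halves: coins/day drops 4x,
# so the break-even price quadruples to $960.
after = breakeven_price(1.0, 0.10, 0.0025)
```

The direction of the dependence is the point: anything that cuts the coins mined per day (a halving, rising difficulty) pushes up the price the market must sustain for miners to stay solvent, which is exactly the incentive described above.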
With the Xbox One X (Scorpio) on the horizon, I've heard a lot of talk about teraflops and how the One X will perform the same as a 1070 because they're close in terms of teraflops. I'm going to try to shed some light on the subject. First off, what is a teraflop? Well, it's a measure of floating-point operations per second - or in layman's terms, "how fast a GPU can do math." So that means higher numbers are better, right? The higher the number, the faster the device can add, subtract, and multiply numbers. Well, not exactly. Teraflops are really only a good measure of performance if you're running pure math and doing nothing else (think Bitcoin mining or physics simulations). For example: the RX 480 has 5.8 TFLOPs and is a pretty capable GPU. However, the 980 Ti has only 5.6 TFLOPs. Now wait a second - the 980 Ti wrecks the 480 in any gaming benchmark. How can a GPU with a lower teraflop rating outperform one with a higher rating? To quote Eurogamer:
Teraflops are a very basic measure of computational power, separate and distinct from all other aspects of GPU design.
Teraflops really have very little to do with gaming performance, because lots of other things impact gaming performance (VRAM bandwidth, cache, etc). In short, the Xbox One X probably won't perform on the same level as a 1070. The One X has 6 TFLOPs, but seeing how the 480/580 (which is very similar to the One X's GPU) stacks up against the 1070, we can't reasonably expect the One X to do much better. *Edit: Of course, the Xbox isn't out yet, so we really have no idea how it performs until it gets into the hands of Digital Foundry (or a similar channel) with the ability to analyze its performance in depth. And just to touch on Forza: the only reason it was running at 1080p60 on the OG One was the dynamic resolution and settings. I suspect that will also be implemented on the One X with Forza 7.
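For reference, the teraflop figures quoted above fall out of a simple formula: shader count x clock speed x 2 FLOPs per cycle (one fused multiply-add counts as two operations). A quick sketch, using commonly cited shader counts and approximate reference clocks:

```python
# Peak single-precision throughput: shaders * clock * 2 ops/cycle
# (one fused multiply-add = 2 floating-point operations).
# Clocks are approximate reference clocks, not in-game boost clocks.

def tflops(shader_count, clock_ghz, ops_per_cycle=2):
    return shader_count * clock_ghz * ops_per_cycle / 1000.0

rx_480 = tflops(2304, 1.266)    # ~5.8 TFLOPs
gtx_980_ti = tflops(2816, 1.0)  # ~5.6 TFLOPs at base clock
```

This also shows why the metric is so narrow: nothing in the formula captures memory bandwidth, caches, or driver efficiency, which is why two GPUs with similar peak FLOPs can perform very differently in games.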
Nvidia sued... AMD-related: How did AMD manage channel better?
TL;DR: Nvidia and AMD both increased production in light of the crypto mania. Did AMD manage inventory better? Discuss.

Tom's Hardware: https://www.tomshardware.com/news/nvidia-class-action-lawsuit-cryptocurrency-amd,38304.html

"AMD responded to the increased GPU demand by boosting production, while Nvidia began stuffing the channel to help push pricing back to sane levels. But then the crypto craze fizzled as the value of Bitcoin plunged, dragging down the other virtual currencies. Suddenly a flood of used graphics cards hit the market at low prices, exacerbating the reduced demand in retail channels. That left Nvidia with "one to two quarters" of oversupply for some of its graphics cards, most notably the GTX 1060. The oversupply purportedly delayed the release of the Turing 2060 cards and also led to the first of many punishing rounds of losses for Nvidia in the stock market."

My Question 1: AMD boosted production, and Nvidia "stuffed the channel" only by boosting production as well. Given that Nvidia cards were more available than AMD's during the mania, can we conclude that Nvidia (over)produced a whole lot more cards? How come AMD was smarter?

My Question 2a: Are investors missing one thing? Even with the channel stuffing, Turing was a new generation after 2 years of Pascal. Was Nvidia's mistake over-emphasizing RTX instead of treating it as a nice-to-have (the decimation of frame rates was shocking, to say the least, and to this date only ONE game supports RTX)? What the heck happened? How can AMD learn from this?

My Question 2b: Were Nvidia's other big mistakes dropping SLI from the 2070 and keeping G-Sync too expensive? Does this mean that once more people enjoy FreeSync, they will realize Nvidia is the poorer choice vs. AMD on the basis of FreeSync 1 or 2 ALONE? Or because CrossFire is still possible, whereas only select "elite" Nvidia purchasers can use multi-GPU?

My Question 3: What does this mean for Navi?

My Question 4: Did Nvidia over-bank on AI?
How can AMD avoid over-reliance on this alleged AI market? Given that AMD ostensibly has less AI market share, and that even with Nvidia's AI gains Nvidia still depends so heavily on constant gaming channel revenue, something doesn't add up for me here.
So I finally gave Honeyminer a try. (my personal semi-review)
This review was last updated 11-30-18. When I first got interested in trying this program, I couldn't find anything about it. It seems a lot of people were too scared to try it, since there's basically no information about it other than from the web page itself. To be honest, I was a bit scared to try it too. I've tried many other programs of this kind on a "test" machine I'm not afraid to lose, on a secondary network and router... in case it's a scam or going to give me a virus, and I suggest anyone installing mining software do the same as a rule of thumb. Please keep in mind the software is still relatively new and they are still working to improve it. They seem to be hiring as well; if you're interested in helping them grow by working for them, look near the bottom for their contact e-mail.

____________________________________________________________________________________________________

This review is for the Windows version of Honeyminer. Because it's still relatively new, I knew it could go one of two ways: "scam software," like almost every mobile mining app and even quite a few desktop ones, or legit. I'm glad to say that after using it for a month, it seems legit. I was able to withdraw from it no problem. If your system is really crappy, it might not work that well on your computer or mining rig. There are no ads, and the program doesn't seem to disrupt any day-to-day activity, at least not on my main system. However, you can of course expect increased heat production, as with any mining software; adequate cooling is important in mining. Anyway, Honeyminer is as close to easy one-click mining software as I have come. They seem to be making a "pro" version too, for more hardcore miners. They do take a fee, which is to be expected (look near the bottom for fee information), but that fee goes down significantly if you have multiple GPUs mining. The good thing about it for me was that it let me set my rig to "autopilot," so to speak.
If you wish to see the H/s numbers in real time, go to your settings and view the "expert logs," which will also tell you what coin is being mined at the time ____________________________________________________________________________________________________________ Pros
Withdrawals work (I know I shouldn't have to say this, but some mining software is a scam and won't withdraw anything). This was tested with Coinbase only so far, and it went through with no issue.
(new) If you go to Dashboard > Activity on their site, you can see a list of all the GPUs/CPUs and computers that are mining, with information about their temperature, the coin they are currently mining, the number of cores, and the potential 24-hour revenue for each. This is just like the "see full activity" feature in the software itself, but you can check it from anywhere.
(new) You can set the app to only mine via GPU or CPU if you so choose in settings.
(new) A miner console has been added, which should make some of the more experienced miners a little happier.
When you click "see full history," it takes you to their webpage, where you can see all your transactions (where your satoshis came from), labeled according to how they were acquired (Mining Credit, Mining Bonus, Referral Mining Credit, Referral Mining Credit Tier 2, and Bonus, meaning other kinds of bonuses, like from leveling up). They are all time-stamped and have an ID number.
Easy to use and easy to install. I literally had no trouble setting it up or installing it; it was quick and easy.
GPU and CPU mining
Mines many different types of cryptocurrencies depending on what's more profitable at the time (autopilot)
Withdrawal as BTC (the withdrawal section says "coming soon ETH, LTC," but I don't think it's a priority yet, and I'm not sure whether they scrapped the idea of USD withdrawals altogether, but I don't see it there).
Idling option: for example, as soon as you use your mouse or type, it will stop mining.
Appears in the Task Manager, which is another one I shouldn't have to say, but you'd be surprised how many fake mining programs won't show up there, or will be listed with an inconspicuous logo or disguised as a system process.
Works in system tray if you'd like to multitask and your system is up for it.
Can be set to mine as soon as you boot up.
Frequent mining "bonuses" you will probably see a lot of them on your transaction history.
A "level-up" system, which I've not seen before, that pays you extra satoshis for reaching the next "level" - think XP in video games. You get rewarded for leveling up, and the higher your level, the higher the bonus, generally. The "next bonus" figure updates as you get closer to leveling up.
You can use multiple computers/rigs on the same account and see them all from any system with the app installed.
2 factor authentication which IMO is a must for anything like this, set that up on their webpage asap.
Earnings log, which you can access from the website manually or by clicking "see full history" in the app.
can see your earnings as USD or as BTC.
shows you a quick earning comparison between today, and the previous two days. (if you don't see it update the software)
"pro" version currently in the works which I look forward to trying.
1st- and 2nd-tier referral rewards.
Referral profits DON'T come out of the referred person's profits; they come out of Stax Digital's profits, so there's no guilt in referring people to this product. I've seen or heard of referral programs that actually punish the referred folks by taking a commission of what the person would have made, in addition to the normal fee. In this case it comes out of the fee that Honeyminer already takes from all users, and not anything extra, as far as I know.
Referred users also get rewarded: if you were to sign up from my links, you'd get 1000 free satoshis just for installing the app. (If you prefer to sign up directly, that's fine too, but there's no signing bonus if you go that route, unless you use someone else's referral link, as far as I'm aware. Whatever works for you.)
team is open to suggestions/feedback, friendly, and respectful.
Code is audited (at least that's what they say).
You can add multiple wallets on their webpage and delete them at will. Another one I shouldn't have to say, but even today some services won't give you that basic functionality.
Able to see what type of coin each CPU/GPU is mining at the time (check out the options and "see full activity").
Pros and/or cons (depending on how you look at it)
Uninstalling gets rid of most of the components, but it seems to leave behind some logs and other files (I was able to search for and remove them in File Explorer). Many programs of all kinds do that, so it's not that big of a con to me, but I can see how it may bother some.
You are not asked to create a password; they create one for you, but you can change it from their website once you've logged in. This can be seen as a good thing by some and a bad thing by others, for various reasons. If this is no longer the case, please let me know.
When clicking in the app to see your full transaction history, it takes you to their webpage and sometimes makes you log in again. This is a good or bad thing depending on how you look at it, I suppose; I personally prefer having to log in again.
No graphs or +/- earnings-over-time comparisons. It does have some logs to see what you're mining in the expert logs section, but not as much information as I would like (the miner console that was added also has more detailed info). I'm hopeful for the future; every mining program that was any good started somewhere.
The installer was still packed with the first version when I downloaded it onto another setup, so you need to update it right off the bat. It doesn't take very long, but I like it when software ships installers with the latest version. (I don't know if this has changed; if you download it and it's already the latest version, let me know.)
May have trouble initializing some GPUs. I can't possibly test every kind, but I've put the ones that didn't work for me below, and I'll update the list if anyone else tells me it doesn't work with a certain setup.
_________________________________________________________________________________________________

COMPATIBILITY: (sorry, it keeps adding asterisks to the card model for no reason)

WORKED ON: every Nvidia card tested so far, with card models dating back from 2014 to now. It worked on some surprisingly low-end and/or old CPUs and GPUs, like an AMD Radeon R9 380 paired with an AMD Athlon II X3 450 processor, and it mines just fine. Of course that processor doesn't make much on its own, lol, but that's an extra 2 or 3 cents per day by itself. I've also tested it with an i3. Most AMD cards worked, but I ran into issues with a few, so maybe it's easier for me to just tell you what did NOT work.

DID NOT WORK ON: any of the AMD ATI Radeon HD 4250s tested so far (2). With that particular card it didn't work at all for mining - it never enabled the GPU - but the CPU on that machine did work. It would generate an "error" on startup but otherwise didn't disrupt mining on that system, except that if I turned on idle earning mode I would get a bunch of errors as it tried to access the GPU. We need the ability to enable or disable hardware individually, I think (errors or no errors, it just seems like a good thing to have). It also didn't work on a system that had both AMD Radeon R7 Graphics and an AMD A8-7650K Radeon R7 (4C+6G), which surprised me considering some of the things that did work, lol. It might just be that one system, but either way I can't vouch that it will work. That system was pre-built and won't allow the parts to be changed or easily removed, and I have to use it for other things, so unfortunately I can't test those parts on another mainboard, at least not without wasting some time, money, and patience that I'd rather dedicate elsewhere for now.
I had some issues using one RX Vega 56 card, but I think it was just that card, because another one worked just fine.

________________________________________________________________________

FEES (w/ comparison to NiceHash): I'm not sure if this part will be helpful to anyone looking into this software or anyone looking to try a different mining program, but if it is, great. NiceHash charges the following fees for "selling/mining" and withdrawing:

Payouts for balances less than 0.1 BTC to an external wallet: 5%
Payouts for balances greater than or equal to 0.1 BTC to an external wallet: 3%
Payouts for balances greater than or equal to 0.001 BTC to a NiceHash wallet: 2%

Withdrawals from the NiceHash wallet are subject to a withdrawal fee that depends on the amount and the withdrawal option:

Any BTC wallet, from 0.002 (min) to 0.05 BTC: 0.0001 BTC
Any BTC wallet, more than 0.05 BTC: 0.2% of the withdrawn amount
Coinbase, more than 0.001 BTC: FREE, no fee (but they also say the minimum Coinbase withdrawal limit is adjusted dynamically according to the API load)

_____________________________________________________________________________

Honeyminer's fees are based on the number of GPUs working: 8% for 1 GPU, or 2.5% for 2 or more GPUs. The only withdrawal fee is the standard BTC transaction fee that the Bitcoin network charges, and it doesn't go to Honeyminer. When they add the other withdrawal options, I suppose that fee can be avoided.

_________________________

Earnings (in comparison to NiceHash): Update: sometimes software/test networks will give a view that can be off plus or minus a few percent compared to actual. A lot of different things can affect your earnings, including where you are located in the world, whether you use more than one mining program day to day, ISP issues, crypto price fluctuations, updates to fees, and inaccuracies in test software/networks.
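To make the two fee schedules easier to compare, here is a small calculator based on the rates quoted above (as stated at the time of this review; check the current fee pages before relying on them):

```python
# Fee comparison using the schedules quoted in this review.

def honeyminer_fee(gross_btc, num_gpus):
    """8% with a single GPU, 2.5% with two or more."""
    rate = 0.08 if num_gpus < 2 else 0.025
    return gross_btc * rate

def nicehash_payout_fee(balance_btc, to_external_wallet=True):
    """5%/3% to an external wallet depending on balance; 2% to a NiceHash wallet."""
    if not to_external_wallet:
        rate = 0.02
    elif balance_btc < 0.1:
        rate = 0.05
    else:
        rate = 0.03
    return balance_btc * rate

fee_hm = honeyminer_fee(0.05, 2)    # 0.00125 BTC at 2.5%
fee_nh = nicehash_payout_fee(0.05)  # 0.0025 BTC at 5%
```

On these numbers, a multi-GPU rig paying 2.5% to Honeyminer keeps more of a small payout than a sub-0.1 BTC external-wallet payout on NiceHash, while a single-GPU setup paying 8% does worse; the crossover depends entirely on GPU count and balance size.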
But I go back and forth between different ones from time to time, and I think that's good practice to keep options open. I notice that Honeyminer seems to do better for me at night; early morning/afternoon is when it has the most trouble raking in the cryptos.

That said, I've been testing how this compares to NiceHash earnings with two of my buddies, so this is an average of the three of our profits/losses compared to NiceHash. I'm using two 10 GPU / 3 CPU setups, one of my buddies is using two 1 GPU / 2 CPU setups, and the other is using two 30 GPU mini farms. We each have two networks, located relatively close by (less than half a mile at the furthest), one with Honeyminer running and the other with NiceHash, and we are looking over 24-hour periods. When all three of us have the results for a day, we average our results together. In all, we'll be looking over a 14-day period.

UPDATE: the results below were done well before the latest update to the software, so I don't know if they have changed. I'd have to do another round, or perhaps some of the community could give me their results and save me a bit of work; I'm not sure when I'd have the time to dig into it again. Sorry it took me so long to get on here and post the results of the last few days of the tests.
Day One: -5%
Day Two: +10%
Day Three: +1%
Day Four: -6%
Day Five: -2%
Day Six: +11%
Day Seven: +2%
Day Eight: +1%
Day Nine: -5%
Day Ten: -11%
Day Eleven: +8%
Day Twelve: +1%
Day Thirteen: +1%
Day Fourteen: -1%
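Averaging the fourteen daily figures above (Honeyminer relative to NiceHash, in percent) gives a result close to zero:

```python
# Daily Honeyminer-vs-NiceHash deltas from the test above, in percent.
days = [-5, 10, 1, -6, -2, 11, 2, 1, -5, -11, 8, 1, 1, -1]
average = sum(days) / len(days)
print(round(average, 2))  # 0.36 -- essentially a wash over the 14 days
```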
Earnings seem to be a bit smaller than NiceHash at times and higher at other times. For me, at least, it seems to pay quicker, and it gets deposited in my NiceHash account sooner than I expected. Hopefully, when they let us pick which coin to mine on our own, it may help somewhat, and any of you who want to move smaller volumes will probably benefit when they add the ability to withdraw other coins/USD. Anyway, when their autopilot system works, it works great, but when it doesn't, it's just "okay," for lack of a better word...

_____________________________________________________

Contact: they have a "contact us" section on their webpage, and they also have a Reddit page, which I was made aware of by contacting them: https://www.reddit.com/HoneyMine

Careers: if anyone is interested in working for them, the job listings at the time of this writing were for Senior Java Developer(s) and Customer Service Representative(s); the email listed is [[email protected]](mailto:[email protected]). I'd suggest checking their site for the requirements. I just added this part to the review as a courtesy in case anyone's interested; it's not meant to be a focus of it. But I know we have some really talented people on Reddit who care passionately about the crypto world, so I'd rather give Honeyminer a chance to get some of those sorts on their team, since it might help improve the software faster for end users, if that makes sense.

_________________________________________________________

UPDATE: if a question reminds me that I left out something I should have mentioned, I'll try to add it here so people don't have to scroll all over the place. I don't write many reviews (for anything), so I don't know if this one was any good, but I hope it was okay, and I'm still a relatively new Reddit user. I just wanted to write this review mainly because there is next to no information on Honeyminer out there, and maybe it can help anyone who's interested in it.

browolf2 asked: Is it basically like NiceHash then?
A: In a way, it's like NiceHash in that it's cloud-based, but you get paid continuously, not just when your pool completes an order; there are no "buyers," only "sellers," if you look at it that way... I hope I'm wording this the right way. It's just straight-up mining, and they take their fee, but compared to NiceHash the fees for "mining" are different.

karl0525 asked: Do you know if we can contact the Honeyminer dev team and see if they will communicate here on Reddit? Might give them some good ideas about what us miners are looking for. Worth a try, maybe? Thanks.

A: I submitted a question through the "contact us" part of their webpage, and I got a reply from them; this is the message I received: "Thank you for writing in and for your interest in Honeyminer. We always welcome feedback and suggestions from our users. We are currently planning on expanding our online and social media presence. Please check out our Reddit page: https://www.reddit.com/HoneyMine"
Based upon the interest shown in my post here earlier today, the following is an ELI5 and AMA post on my perspective as a cryptocurrency investor and miner - specifically, how I see the cryptocurrency space impacting AMD's performance in the near to medium term (0-3 years).

My Background: I am not a computer scientist, and many on this forum know significantly more than I ever will about computing, computing hardware design, and software. Take this into consideration when reading my post, and feel free to open up discussion if you disagree with me; I am always looking to learn and assess new perspectives. I do, though, have a background in STEM, have until recently followed AMD, Intel, and Nvidia closely in regards to consumer and enthusiast hardware releases, have been mining Ethereum on a hand-built machine for roughly the past year, and have been investing in crypto for a decent amount of time as well. Given this, I believe I can provide insight into the cryptocurrency and crypto-mining realm, which is tightly coupled to AMD's GPU sales.

My Motivation for Writing This: About a year ago I was a daily browser of this sub. Check my profile history if you wish. It was this very sub that gave me the confidence to make my first investments outside of a 401k. Through this sub's members I laid a foundation for making future investments that I will carry with me through life.

How I Got Started in Cryptocurrency: Ironically, my start in cryptocurrency came through this very sub. I was a daily follower of AMD_STOCK during the initial Ethereum run-up early last year, when AMD and Nvidia GPUs were selling like hotcakes. Prices for GPUs released months prior were rising instead of falling. I had no clue what a cryptocurrency even was. I distinctly remember reading through a post on this sub explaining the GPU shortage. It was simply "Ethereum." I don't know why, but this post struck me more than it should have.
How could a shortage of hundreds of thousands of GPUs, totaling millions of dollars, be summed up in one word? This was the entrance to the rabbit hole that is cryptocurrency - or, what I think is more telling, the financial and supply-chain tech revolution.

Cryptocurrency ELI5: Cryptocurrency is currently so much more than Bitcoin. It is the financial, supply-chain, plus whatever else it ends up touching, technology revolution that is taking place as we speak. Cryptocurrency is simply a set of protocols that allow monetary/data transactions, smart contracts (think "if a, do b"), and/or storage in a distributed and trustless way, without a middleman.

ELI5: It is a system that allows you and little Johnny from down the street to pay each other allowance money for things without your mommies needing to get involved to make sure no one is getting cheated (peer-to-peer payments). It can also allow you and Johnny to make deals with each other, and Johnny won't be able to get out of it by saying "just kidding" later on (smart contracts). In both of these cases, you and Johnny write down the agreed-upon payment, deal, or information on a piece of paper, sign your names, and then send it out to everyone you know. Once those people recognize your and Johnny's signatures, they sign it as well (distributed ledger). If there are any disagreements later, you look at the piece of paper and see what actually happened. For much more detail, visit the cryptocurrency sub or some of the other cryptocurrency subs.

Proof of Work (PoW) vs Proof of Stake (PoS): I talked previously about handing out a copy of transactions to other peers for consensus. I was referring to a distributed ledger. This allows those who use the network to look over previous transactions and come to an agreement upon past history, avoid double spends (someone giving the same dollar to two different people), and verify a user's current funds.
Well, it doesn’t exactly work like that, and different cryptocurrencies employ different “consensus mechanism’s”. IT IS THESE CONSENSUS MECHANISMS THAT ARE OF IMPORTANCE AS AMD INVESTORS. I’ll try to go through the most prominent ones below. Consensus Mechanisms: Eli5: They solve the question: What if you and Johnny both hand out copies containing different information? Who decides what the truth is? Proof of Work (PoW): Eli5: Proof of Work is like if you and Johnny hand out copies of your transactions to each of your classmates, the teacher decides that this isn’t a democracy, and that not everyone gets to vote on what they think happened. The teacher says that for each math problem in today’s math quiz a student gets right, they get one vote to put in the jar up at the front of the class. After the quiz is done and everyone puts their votes in the jar, the teacher then reaches in and grabs a random vote on if you or Johnny were telling the truth. It is then recorded. Also, the student’s who’s vote was selected gets a gold star today (mining rewards, what makes this all profitable for miners). How is AMD involved in this? AMD’s GPU’s are what solves the math problems for the students in this example. The more math problems that they can solve correctly before the quiz is over, the higher chance that they have at getting to decide what is recorded on the ledger, and thus receive mining rewards (free cryptocurrency). Proof of Stake (PoS): Eli5: Well the teacher decided that she didn’t like doing math tests anymore because they took too much time and thought that the paper and pencils consumed during the quiz’s were a waste of the school’s resources (electricity used in PoW). She decided that instead, each student would get one vote based upon how many gold stars (how much cryptocurrency) they already have. But the catch is, if a student is caught lying somehow on their vote, they get all of their current gold stars taken away. 
This is what is "at stake" in the Proof of Stake model. How does this differ from PoW from an AMD perspective? Well, if you haven't noticed, there are no more math problems to be solved in this model, so high-performance GPUs are not necessary for PoS mining. This provides several advantages in terms of energy savings, but it would not be good for AMD's sales.

The Current State of the Market in Regards to PoW vs PoS: Currently, a majority of cryptocurrencies operate on the PoW model, but that ratio is dwindling as currencies switch over to PoS. PoS is seen as providing several advantages, the major ones being energy efficiency and potentially reduced transaction times. Major cryptocurrencies using PoW include Ethereum, Monero, Zcash, etc., with the most profitable over the past year usually being Ethereum. Ethereum is currently planning on switching over to PoS, but that transition has been delayed, and the plan is now to first move to a hybrid PoW/PoS model before fully transferring over to PoS. I have not heard any rumors of Monero or Zcash transitioning to PoS in the short term.

My Perspective/Predictions on AMD GPU Sales Over the Short and Medium Term:
Cryptocurrency over the medium term will continue to flourish/rise. There may be a major “crash” in the future, but I believe that is at least a year away, and a crash event would still leave the total market cap higher than it currently is valued at ~600 Billion dollars.
It will be 1+ year before a significant portion of current major PoW currencies phase out PoW for PoS.
AMD will continue to sell out GPU products for the foreseeable future (~1 year) as 1 & 2 above create a recipe for sustained/increased profitability in cryptocurrency mining.
Long Term - PoW will likely fade away as PoS grows in popularity. I foresee this happening in the 1-3 year time frame. What happens to AMD? Well, if the transition happens fast, used gaming GPUs will flood the market, and new hardware sales will obviously struggle to compete. If the transition happens more slowly, I see the trend being less violent for AMD as a company, provided they can keep up generation-to-generation performance improvements. There would still be a flood of cheap used hardware on the market, but if new, higher-performance hardware is released before enough used hardware floods in, the old hardware becomes obsolete for mid- to high-end gamers. That would be a huge win for AMD investors, as it would minimize any impact to sales.
Because of the statement above, pay close attention to the PoS transition timeframe for Ethereum. This will be the first mass selloff of consumer GPUs.
Things I did not Cover:
AMD GPUs are typically more profitable than NVIDIA’s for cryptocurrency mining and why.
You cannot mine Bitcoin with consumer GPUs profitably. They require custom hardware (ASIC).
Getting into the actual process of how to mine (see the many Ethereum mining subs like ethermining for answers).
Have I made a profit – Yes, I have paid off my investment and then some.
What do I think of mining vs just investing – Okay, I’ll answer this one. I personally would choose to invest directly into the cryptocurrencies over mining, unless you are using your existing gaming GPU, as I believe that investing will yield potentially an order of magnitude higher ROI over the next 2-5 years. Start with cryptocurrency and go from there. If you have specific questions, feel free to PM me. This is coming from a miner, mind you.
My exit plan for the market? Well, I’ve stated above that I think a major crash in this market (greater than 50%; we see 50% crashes every 3 months or so, but those are often more than recovered by subsequent gains) would likely dip to current or slightly below current total market cap. I could be wrong, but that’s a risk I am willing to take given my deep dive on this space. I currently hold currencies that will pay PoS mining rewards, and I plan to sell those rewards.
Thanks for reading guys. I hope you found some useful information. If you have questions or see anything you disagree with feel free to comment! TLDR: I see cryptocurrency, cryptocurrency mining, and thus AMD GPU sales holding strong for the foreseeable short term ~1 year. This is just my opinion, do your own research, I could be wrong, but I live in this space.
Although I don't have a complete mastery of the concept either, I will do my best to use my limited knowledge to explain what exactly the heck is going on with this mystical "Litecoin Mining" and the disappearance of AMD GPUs worldwide. What exactly is Litecoin mining? Litecoin mining, or cryptocurrency mining in general, is the use of processors to hash out a value and submit this work as something called "Proof of Work" that is mutually recognized. To put this in an analogy, think of a 3rd grade math classroom. The teacher writes a math problem on the board, say "2+3", and she tells the class to find the answer. Quickly, each student begins to work out the problem, and when a student thinks they've found the answer, they raise their hand and respond, "The answer is 5!". The teacher will then tell the student whether they are right or wrong, and the rest of the class will listen and understand that the one student got the answer right. Then they clap for him, recognizing his achievement and the work needed to arrive at the answer. Now, in terms of mining, the difficulty is scaled to immense proportions, but the basic principle is the same. You, the miner, are the student trying to find the answer. The Litecoin algorithm is the confirming teacher. And the rest of the miners are the other students in the classroom, recognizing your work. Litecoin itself is a derivative of Bitcoin, another cryptocurrency. Sources: Litecoin Website and Bitcoin (something like Litecoin) So what is so important about this answer? In reality, nothing. It just solves a math problem. But that's the whole point. When the Litecoin algorithm sees you've found the right answer and everybody agrees you found it (so no cheating!), it will give you a "block reward". Like the teacher handing you a piece of candy for getting the right answer. This block reward contains an amount of Litecoins. What is a "Litecoin"? 
A Litecoin is a medal, or proof of work done, and it is given through the reward system aforementioned. Litecoins are just numerical values stored inside of the "blockchain", which you can think of as the classroom. All of the players in this Litecoin mining game are located within this classroom, the blockchain. When everyone recognizes your work, you are given litecoins, which can only be accessed through wallets. You may think of litecoins as a currency, like pennies or nickels. They are something that we give value, yet they do not have inherent value themselves. A paper bill is just paper unless we give it value. So what is a wallet? Wallets are where litecoins are stored, using a private key and a public key. The private key is a large random number (usually displayed in base58 encoding), and the public key is derived from it. You share the public key with others who want to send you litecoins, while you keep the private key to yourself, since whoever knows the private key can spend those litecoins. Now, the ultimate question: Why are they taking our graphics cards?! Ah yes. Now to talk about AMD GPUs. Litecoin uses the scrypt algorithm, a memory- and computation-intensive algorithm that rewards many cores and lots of fast RAM. Graphics cards fit both of these profiles perfectly. A graphics card has hundreds or thousands of computation cores, as well as lightning-fast GDDR5 memory. As such, graphics cards are snatched up by miners wanting to create money out of thin air, which is basically what they're doing. But why AMD? Why not NVIDIA? AMD graphics cards are more efficient at the bit operations this kind of mining uses than NVIDIA graphics cards, further explained here. All you really have to know is that the AMD architecture is much more efficient for cryptocurrency mining. But why NOW? Bitcoin/Litecoin has been around for a while. Why are miners stealing our graphics cards NOW? Well, to put it bluntly, it's because people want to get rich. 
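The classroom analogy above maps directly onto a toy proof-of-work loop: the "math problem" is finding a nonce that makes the hash of the block data fall below a target, and checking someone else's answer is cheap. This is a simplified sketch using SHA-256 (real Litecoin uses scrypt, and the function names here are made up for illustration):

```python
import hashlib

def mine(block_data: str, difficulty_bits: int) -> int:
    """Try nonces until the hash of block_data+nonce falls below a target.
    More difficulty bits = smaller target = a harder 'math problem'."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # "The answer is 5!"
        nonce += 1

def verify(block_data: str, nonce: int, difficulty_bits: int) -> bool:
    """The 'teacher' (and every other miner) checks with a single hash."""
    digest = hashlib.sha256(f"{block_data}{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < 1 << (256 - difficulty_bits)

nonce = mine("block #1: Alice pays Bob 5 LTC", difficulty_bits=16)
print(verify("block #1: Alice pays Bob 5 LTC", nonce, 16))  # True
```

Finding the nonce takes tens of thousands of hash attempts on average even at this toy difficulty, while verifying takes exactly one; that asymmetry is what lets the whole classroom agree on who raised their hand first.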
Recently, with further adoption by companies and growing media coverage of cryptocurrencies, the Litecoin price has skyrocketed. I'm talking on the order of 100-fold. Due to this rapid increase in value, miners have been rushing for more graphics cards to make more money. If you take a look at this source, you can see that Litecoin mining profitability is insanely high compared to Bitcoin, Litecoin's main competitor. Look at the right column in the table that says "Profit Ratio vs. BTC". See the numbers in the 3000% range? Yeah, that's a lot of profitability that can be taken advantage of. Will we ever get our beloved 7950s back here in /buildapc? Q_Q Maybe. Delving deeper into what Litecoin is, I'll now explain difficulty. As more students work to find the answer to a math problem, the teacher must create more and more difficult problems for them to solve, or else these little 3rd graders will solve them faster than the teacher has candy for. In Litecoin mining, this equates to the difficulty level. As the difficulty increases, litecoins are harder to hash out, and the rewards diminish. As more students/miners join the blockchain, mining will get less and less profitable. HOWEVER. If the price of Litecoin skyrockets again, mining will be able to sustain profitability, and we may be out of AMD graphics cards for a LONG, LONG TIME. Not to mention, once Litecoin becomes unprofitable, if it indeed does, miners will move to other derivatives of cryptocurrencies, restarting the cycle and continuing to snatch up graphics cards. It is unlikely that this trend continues indefinitely, but if it does, NVIDIA may be the only option for new gamers. Give me a TL;DR summary! I'll try my best. People are sitting in basements making money out of thin air using our graphics cards. They are making $12 a day per graphics card while basically only paying for electricity. 
This trend will continue for quite some time, meaning no more AMD graphics cards for a while. If you're a gamer, look towards NVIDIA, or be prepared to throw down a lot of cash. I obviously did not cover EVERYTHING in this post, but the main principles are there. If you want to delve deeper into this, I'd suggest doing your research and looking on the Bitcoin Forum. EDIT: Please check out /litecoin and /litecoinmining as well. Lots of good information.
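To put a number on the difficulty mechanism described above: Litecoin, like Bitcoin, periodically retargets difficulty so blocks keep arriving at a fixed interval (2.5 minutes for Litecoin, rechecked every 2016 blocks). A simplified sketch of that rule; the function name and the 4x clamp (borrowed from Bitcoin's retarget rule) are my own framing:

```python
def retarget(old_difficulty: float, expected_secs: float, actual_secs: float) -> float:
    """If the last batch of blocks came in too fast, scale difficulty up
    proportionally (and down if too slow), clamped to 4x either way."""
    ratio = expected_secs / actual_secs
    ratio = max(0.25, min(4.0, ratio))  # avoid wild single-step swings
    return old_difficulty * ratio

# 2016 blocks at 2.5 min each should take 3.5 days; a mining gold rush
# finds them in half that time, so difficulty doubles:
window = 2016 * 2.5 * 60  # seconds
print(retarget(1000.0, window, window / 2))  # 2000.0
```

This is why more miners chasing the same coin erodes everyone's rewards: the difficulty keeps rising until the average payout per unit of hash power falls back down.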
Sup-sup Monteros! :-) Here is a report from the XMR.RU team! I want to thank you for your support and your donations. The following articles were translated into Russian and posted not only on XMR.RU but also on Bitcointalk, Forum.Bits.Media, Golos.io, Steemit, Medium and Facebook:
-- Who are we? A group of Monero enthusiasts from Ukraine and Russia. What do we do? We spread the word about Monero across the whole CIS. You can support us, so we can translate more interesting stuff about Monero. XMR: 42CxJrG1Q8HT9XiXJ1Cim4Sz18rM95UucEBeZ3x6YuLQUwTn6UWo9ozeA7jv13v8H1FvQn9dgw1Gw2VMUqdvVN1T9izzGEt BTC: 1FeetSJ7LFZeC328FqPqYTfUY4LEesZ5ku Here you can see what all donations are spent on. ;-) Cheers! :-)
Console gaming is hardly different from PC gaming, and much of what people say about PC gaming to put it above console gaming is often wrong.
I’m not sure about you, but for the past few years, I’ve been hearing people go on and on about PCs’ "superiority" over the console market. People cite various reasons why they believe gaming on a PC is “objectively” better than console gaming, often reasons related to power, costs, ease of use, and freedom. …Only problem: much of what they say is wrong. There are many misconceptions being thrown about in the PC gaming vs console gaming debate that I believe need to be addressed. This isn’t about “PC gamers being wrong,” or “consoles being the best,” absolutely not. I just want to cut through some of the stuff people use to put down console gaming, and show that console gaming is incredibly similar to PC gaming. I mean, yes, this is coming from someone who mainly games on console, but I am also getting a new PC that I will game on as well, not to mention the 30 PC games I already own and play. I’m not particularly partial to one over the other. Now, I will mainly be focusing on the PlayStation side of the consoles, because I know it best, but much of what I say will apply to Xbox as well. Just because I don’t point out many specific Xbox examples doesn’t mean that they aren’t out there.
“PCs can use TVs and monitors.”
This one isn’t so much a misconception as it is the implication of one, and overall just… confusing. This appears in some articles and the pcmasterrace “why choose a PC” section, where they’re practically implying that consoles can’t do this. I mean, yes, as long as the ports on your PC match up with your screen(s’) inputs, you can plug a PC into either… but you can do the same with a console, again, as long as the ports match up. I’m guessing the idea here is that gaming monitors often use DisplayPort, as do most dedicated GPUs, while consoles are generally restricted to HDMI… But even so, monitors often have HDMI ports. In fact, PC Magazine has just released their list of the best gaming monitors of 2017, and every single one of them has an HDMI port. A PS4 can be plugged into these just as easily as a GTX 1080. I mean, even if the monitor/TV doesn’t have HDMI or AV to connect with your console, just use an adapter. If you have a PC with ports that don’t match your monitor/TV… use an adapter. I don’t know what the point of this argument is, but it’s made a worrying amount of times.
“On PC, you have a wide range of controller options, but on console you’re stuck with the standard controller."
Are you on PlayStation and wish you could use a specific type of controller that suits your favorite kind of gameplay? Despite what some may believe, you have just as many options as PC. Want to play fighting games with a classic arcade-style board, featuring the buttons and joystick? Here you go! Want to get serious about racing and get something more accurate and immersive than a controller? Got you covered. Absolutely crazy about flying games and, like the racers, want something better than a controller? Enjoy! Want Wii-style motion controls? Been around since the PS3. If you prefer the form factor of the Xbox One controller but you own a PS4, Hori’s got you covered. And of course, if keyboard and mouse is what keeps you on PC, there’s a PlayStation-compatible solution for that. Want to use the keyboard and mouse that you already own? Where there’s a will, there’s a way. Of course, these aren’t isolated examples; there are plenty of options for each of these kinds of controllers. You don’t have to be on PC to enjoy alternate controllers.
“On PC you could use Steam Link to play anywhere in your house and share games with others.”
PS4 Remote play app on PC/Mac, PSTV, and PS Vita. PS Family Sharing. Using the same PSN account on multiple PS4s/Xbox Ones and PS3s/360s, or using multiple accounts on the same console. In fact, if multiple users are on the same PS4, only one has to buy the game for both users to play it on that one PS4. On top of that, only one of them has to have PS Plus for both to play online (if the one with PS Plus registers the PS4 as their main system). PS4 Share Play; if two people on separate PS4s want to play a game together that only one of them owns, they can join a Party and the owner of the game can have their friend play with them in the game. Need I say more?
“Gaming is more expensive on console.”
Part 1: the Software. This is one that I find… genuinely surprising. There have been a few times I’ve mentioned that part of the reason I chose a PS4 is for budget gaming, only to be told that “games are cheaper on Steam.” To be fair, there are a few games on PSN/XBL that are more expensive than they are on Steam, so I can see how someone could believe this… but apparently they forgot about discs. Dirt Rally, a hardcore racing sim, is… still $60 on all 3 platforms digitally… even though its successor is out.
See my point? Oftentimes the game is cheaper on console because of the disc alternative that’s available for practically every console-available game, even when the game is brand new. Dirt 4 - remember that Dirt Rally successor I mentioned?
Yes, you could either buy this relatively new game digitally for $60, or just pick up the disc for a discounted price. And again, this is for a game that came out 2 months ago, while its predecessor’s digital cost is locked at $60. Of course, I’m not going to ignore the fact that Dirt 4 is currently (as of writing this) discounted on Steam, but on PSN it also happens to be discounted by about the same amount. Part 2: the Subscription. Now… let’s not ignore the elephant in the room: PS Plus and Xbox Gold. These would be ignorable if they weren’t required for online play (on the PlayStation side, it’s only required for PS4, but still). So yes, it’s still something that will be included in the cost of your PS4 or Xbox One/360, assuming you play online. Bummer, right? Here’s the thing: although you have to factor this $60 annual cost in with your console, you can make it balance out, at worst, and make it work out for you as a budget gamer, at best. As nice as it would be to not have to deal with the price if you don’t want to, it’s not a problem if you use it correctly. Imagine going to a new restaurant. This restaurant has some meals that you can’t get anywhere else, and fair prices compared to competitors. Only problem: you have to pay a membership fee to have the sides. You can have the main course, sit down and enjoy your steak or pasta, but if you want a side to make it a full meal, you have to pay an annual fee. Sounds shitty, right? But here’s the thing: not only does this membership allow you to have sides with your meal, but it also allows you to eat two meals for free every month, and gives you exclusive discounts on other meals, drinks, and desserts. Let’s look at PS Plus for a minute: for $60 per year, you get:
2 free PS4 games, every month
2 free PS3 games, every month
1 PS4/PS3 and Vita compatible game, and 1 Vita-only game, every month
Exclusive/Extended discounts, especially during the weekly/seasonal sales (though you don’t need PS Plus to get sales, PS Plus members get to enjoy the best sales)
access to online multiplayer
So yes, you’re paying extra because of that membership, but what you get with that deal pays for it and then some. In fact, let’s ignore the discounts for a minute: you get 24 free PS4 games, 24 free PS3 games, and 12 Vita-only + 12 Vita-compatible games, up to 72 free games every year. Even if you only own one of these consoles, that’s still 24 free games a year. Sure, maybe you get games for the month that you don’t like; then just wait until next month. In fact, let’s look at Just Cause 3 again. It was free for PS Plus members in August, which is a pretty big deal. Why is this significant? Because it’s, again, a $60 digital game. That means with this one download, you’ve balanced out your $60 annual fee. Meaning? Every free game after that is money saved, every discount after that is money saved. And this is a trend: every year, PS Plus will release a game that balances out the entire service cost, then another 23 more that only add icing to that budget cake. Though, you could instead just count games as paying off PS Plus until you hit $60 in savings, but still. All in all, PS Plus, and Xbox Gold which offers similar options, saves you money. On top of that, again, you don’t need these memberships to get discounts, but with them, you get more discounts. Now, I’ve seen a few Steam games go up for free for a week, but what about being free for an entire month? Not to mention that even if you want to talk about Steam Summer Sales, what about the PSN summer sale, or again, disc sale discounts? Now, a lot of research and math would be needed to see if every console gamer would save money compared to every Steam gamer for the same games, but at the very least? The costs will balance out, at worst. Part 3: the Systems
Xbox and PS2: $299
Xbox 360 and PS3: $299 and $499, respectively
Xbox One and PS4: $499 and $399, respectively.
Rounded up a few dollars, that’s $1,000 - $1,300 in day-one consoles, just to keep up with the games! Crazy, right? So-called budget systems, such a rip-off. Well, keep in mind that the generations here aren’t short. The 6th generation, from the launch of the PS2 to the launch of the next generation of consoles, lasted 5 years, or 6 based on the launch of the PS3 (though you could say it was 9 or 14, since the Xbox wasn’t discontinued until 2009, and the PS2 was supported all the way to 2014, a year after the PS4 was released). The 7th gen lasted 7 - 8 years, again depending on whether you count from the launch of the Xbox 360 or the PS3. The 8th gen so far has lasted 4 years. That’s 17 years that the console money is spread over. If you had a Netflix subscription on its original $8 monthly plan for that amount of time, that would be over $1,600 total. And let’s be fair here, just like you could upgrade your PC hardware whenever you wanted, you didn’t have to get a console at launch. Let’s look at PlayStation again for example: In 2002, only two years after its release, the PS2 retail price was cut from $300 to $200. The PS3 Slim, released 3 years after the original, was $300, $100-$200 lower than the retail cost. The PS4? You could’ve either gotten the Uncharted bundle for $350, or one of the PS4 Slim bundles for $250. This all brings it down to $750 - $850, which again, is spread over a decade and a half. This isn’t even counting used consoles, sales, or the further price cuts that I didn’t mention. Even if that still sounds like a lot of money to you, even if you’re laughing at the thought of buying new systems every several years because your PC “is never obsolete,” tell me: how many parts have you changed out in your PC over the years? How many GPUs have you been through? CPUs? Motherboards? RAM sticks, monitors, keyboards, mice, CPU coolers, hard drives… it adds up. You don’t need to replace your entire system to spend a lot of money on hardware. 
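As a sanity check, the arithmetic in this section works out as claimed (these are the prices quoted above):

```python
# Day-one launch prices quoted above, spread over ~17 years
day_one_xbox_path = 299 + 299 + 499   # Xbox, Xbox 360, Xbox One
day_one_ps_path = 299 + 499 + 399     # PS2, PS3, PS4
print(day_one_xbox_path, day_one_ps_path)  # 1097 1197 -> ~$1,000-$1,300 rounded

# Netflix's original $8/month plan over the same 17 years:
print(8 * 12 * 17)  # 1632 -> "over $1,600 total"

# The late-adopter path: PS2 after its price cut, PS3 Slim, PS4 Slim bundle
print(200 + 300 + 250)  # 750 -> the low end of the $750-$850 range
```

Spread over 17 years, even the all-day-one path comes to roughly $70 per year in console hardware.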
Even if you weren’t upgrading for the sake of upgrading, I’d be amazed if the hardware you’ve been pushing by gaming would last even a third of that 17-year period. Computer parts aren’t designed to last forever, and really won’t when you’re pushing them with intensive gaming for hours upon hours. Generally speaking, your components might last you 6-8 years, if you’ve got the high-end stuff. But let’s assume you bought a system 17 years ago that was a beast for its time, something so powerful that even if its parts have degraded over time, it’s still going strong. Problem is: you will have to upgrade something eventually. Even if you’ve managed to get this far into the gaming realm with the same 17-year-old hardware, I’m betting you didn’t do it with a 17-year-old operating system. How much did Windows 7 cost you? Or 8.1? Or 10? Oh, and don’t think you can skirt the cost by getting a pre-built system; the cost of Windows is embedded into the cost of the machine (why else would Microsoft allow their OS to go on so many machines?). Sure, Windows 10 was a free upgrade for a year, but that’s only half of its lifetime: you can’t get it for free now, and couldn’t for the past year. On top of that, the free period was an upgrade; you had to pay for 7 or 8 first anyway. Point is, as much as one would like to say that they didn’t need to buy a new system every so often for the sake of gaming, that doesn’t mean they haven’t been paying for hardware, and even if they’ve only gotten into PC gaming recently, they’ll be spending money on hardware soon enough.
“PC is leading the VR—“
Let me stop you right there. If you add together the total number of Oculus Rifts and HTC Vives sold to this day, and threw in another 100,000 just for the sake of it, that number would still be under the number of PSVR headsets sold. Why could this possibly be? Well, for a simple reason: affordability. The systems needed to run the PC headsets cost $800+, and the headsets are $500 - $600, when discounted. PSVR on the other hand costs $450 for the full bundle (headset, camera, and Move controllers, with a demo disc thrown in), and can be played on either a $250 - $300 console, or a $400 console, the latter recommended. Even if you want to say that the Vive and Rift are more refined, a full PSVR set, system and all, could cost just over $100 more than a Vive headset alone. If anything, PC isn’t leading the VR gaming market, the PS4 is. It’s the system bringing VR to the most consumers, showing them what the future of gaming could look like. Not to mention that as the PlayStation line grows more powerful (4.2 TFLOP PS4 Pro, 10 TFLOP “PS5…”), it won’t be long until the PlayStation line can run the same VR games as PC. Either way, this shows that there is a console equivalent to the PC VR options. Sure, there are some games you'd only be able to play on PC, but there are also some games you'd only be able to play on PSVR. …Though to be fair, if we’re talking about VR in general, these headsets don’t even hold a candle to, surprisingly, Gear VR.
“If it wasn’t for consoles holding devs back, then they would be able to make higher quality games.”
This one is based on the idea that because of how “low spec” consoles are, that when a developer has to take them in mind, then they can’t design the game to be nearly as good as it would be otherwise. I mean, have you ever seen the minimum specs for games on Steam? GTA V
Actually, bump up all the memory requirements to 8 GBs, and those are some decent specs, relatively speaking. And keep in mind these are the minimum specs to even open the games. It’s almost as if the devs didn’t worry about console specs when making a PC version of the game, because this version of the game isn’t on console. Or maybe even that the consoles aren’t holding the games back that much because they’re not that weak. Just a hypothesis. But I mean, the devs are still ooobviously having to take weak consoles into mind right? They could make their games sooo much more powerful if they were PC only, right? Right? No. Not even close. iRacing
CPU: Intel Core i3, i5, i7 or better or AMD Bulldozer or better
Memory: 8 GB RAM
GPU: NVidia GeForce 2xx series or better, 1GB+ dedicated video memory / AMD 5xxx series or better, 1GB+ dedicated video memory
These are PC only games. That’s right, no consoles to hold them back, they don’t have to worry about whether an Xbox One could handle it. Yet, they don’t require anything more than the Multiplatform games. Subnautica
So what’s the deal? Theoretically, if developers don’t have to worry about console specs, then why aren’t they going all-out and making games that no console could even dream of supporting? Low-end PCs. What, did you think people only game on Steam if they spent at least $500 on gaming hardware? Not all PC gamers have gaming-PC specs, and if devs close their games out to players who don’t have the strongest of PCs, then they’d be losing out on a pretty sizable chunk of their potential buyers. Saying “devs having to deal with consoles is holding gaming back” is like saying “racing teams having to deal with Ford is holding GT racing back.” A: racing teams don’t have to deal with Ford if they don’t want to, which is probably why many of them don’t, and B: even though Ford doesn’t make the fastest cars overall, they still manage to make cars that are awesome on their own, they don’t even need to be compared to anything else to know that they make good cars. I want to go back to that previous point though, developers having to deal with low-end PCs, because it’s integral to the next point:
“PCs are more powerful, gaming on PC provides a better experience.”
This one isn’t so much of a misconception as it is… misleading. Did you know that according to the Steam Hardware & Software Survey (July 2017), the percentage of Steam gamers who use a GPU less powerful than a PS4 Slim’s is well over 50%? Things get dismal when compared to the PS4 Pro (or Xbox One X). On top of that, the percentage of PC gamers who own an Nvidia 10-series card is about 20% (about 15% for the 1060, 1070 and 1080 owners). Now to be fair, the large majority of gamers have CPUs with considerably high clock speeds, which is the main factor in CPU gaming performance. But the number of Steam gamers with as much RAM as a PS4 or Xbox One, or more, is less than 50%, which can really bottleneck what those CPUs can handle. These numbers are hardly better than they were in 2013, all things considered. Sure, a PS3/360 weeps in the face of even a $400 PC, but in this day and age, consoles have definitely caught up. Sure, we could mention the fact that even 1% of Steam accounts represents over 1 million accounts, but that doesn’t really matter compared to the tens of millions of 8th gen consoles sold; looking at it that way, sure, the number of Nvidia 10-series owners is over 20 million, but there are over 5 times more 8th gen consoles sold than that. Basically, even though PCs run on a spectrum, saying they're more powerful “on average” is actually wrong. Sure, they have the potential to be more powerful, but most of the time, people aren’t willing to pay the premium to reach those extra bits of performance. Now why is this important? What matters are the people who spent the premium cost for premium parts, right? 
Because of the previous point: PCs don’t have some universal quality advantage over the consoles. Developers will always have to keep low-end PCs in mind, because not even half of all PC players have the good stuff, and you have to look at the top quarter of Steam players before you get to PS4-Pro-level specs. If every Steam player were to get a PS4 Pro, it would be an upgrade for over 60% of them, and 70% of them would be getting an upgrade with the Xbox One X. Sure, you could still make the argument that when you pay more for PC parts, you get a better experience than you could with a console. We can argue all day about budget PCs, but a console can’t match up to a $1,000 PC build. It’s the same as paying more for car parts: in the end you get a better car. However, there is a certain problem with that…
“You pay a little more for a PC, you get much more quality.”
The idea here is that the more you pay for PC parts, the performance increases at a faster rate than the price does. Problem: that’s not how technology works. Paying twice as much doesn’t get you twice the quality the majority of the time. For example, let’s look at graphics cards, specifically the GeForce 10 series cards, starting with the GTX 1050.
1.35 GHz base clock
2 GB VRAM
This is our reference, our basis of comparison. Any percentages will be based on the 1050’s specs. Now let’s look at the GTX 1050 Ti, the 1050’s older brother.
1.29 GHz base clock
4 GB VRAM
This is pretty good. You only increase the price by about 27%, and you get an 11% increase in floating point speed and a 100% increase (double) in VRAM. Sure, you get a slightly lower base clock, but the rest definitely makes up for it. In fact, according to GPU Boss, the Ti managed 66 fps, a 22% increase in frame rate for Battlefield 4, and a 54% increase in mHash/second in bitcoin mining. The cost increase is worth it, for the most part. But let’s get to the real meat of it; what happens when we double our budget? Surely we should see a massive increase in performance; I imagine some of you are willing to bet that twice the cost means more than twice the performance. The closest price comparison for double the cost is the GTX 1060 (3 GB), so let’s get a look at that.
1.5 GHz base clock
3 GB VRAM
Well… not substantial, I’d say. About a 50% increase in floating point speed, an 11% increase in base clock speed, and a 1 GB decrease in VRAM. For [almost] doubling the price, the raw specs don’t promise much. But surely raw specs don’t tell the full story, right? Well, let’s look at some real world comparisons. Once again, according to GPU Boss, there’s a 138% increase in hashes/second for bitcoin mining, and at 99 fps, an 83% frame rate increase in Battlefield 4. Well then, raw specs do not tell the whole story! Here’s another one, the 1060’s big brother… or, well, slightly-more-developed twin.
1.5 GHz base clock
6 GB VRAM
Seems reasonable, another $50 for a decent jump in power and double the memory! But, as we’ve learned, we shouldn’t look at the specs for the full story. I did do a GPU Boss comparison, but for the BF4 frame rate, I had to look at Tom’s Hardware (sorry miners, GPU boss didn’t cover the mHash/sec spec either). What’s the verdict? Well, pretty good, I’d say. With 97 FPS, a 79% increase over the 1050— wait. 97? That seems too low… I mean, the 3GB version got 99. Well, let’s see what Tech Power Up has to say... 94.3 fps. 74% increase. Huh. Alright alright, maybe that was just a dud. We can gloss over that I guess. Ok, one more, but let’s go for the big fish: the GTX 1080.
1.6 GHz base clock
8 GB VRAM
That jump in floating point speed definitely has to be something, and 4 times the VRAM? Sure it’s 5 times the price, but as we saw, raw power doesn’t always tell the full story. GPU Boss returns to give us the run down, how do these cards compare in the real world? Well… a 222% (over three-fold) increase in mHash speed, and a 218% increase in FPS for Battlefield 4. That’s right, for 5 times the cost, you get 3 times the performance. Truly, the raw specs don’t tell the full story. You increase the cost by 27%, you increase frame rate in our example game by 22%. You increase the cost by 83%, you increase the frame rate by 83%. Sounds good, but if you increase the cost by 129%, and you get a 79% (-50% cost/power increase) increase in frame rate. You increase it by 358%, and you increase the frame rate by 218% (-140% cost/power increase). That’s not paying “more for much more power,” that’s a steep drop-off after the third cheapest option. In fact, did you know that you have to get to the 1060 (6GB) before you could compare the GTX line to a PS4 Pro? Not to mention that at $250, the price of a 1060 (6GB) you could get an entire PS4 Slim bundle, or that you have to get to the 1070 before you beat the Xbox One X. On another note, let’s look at a PS4 Slim…
800 MHz base clock
8 GB VRAM
…Versus a PS4 Pro.
911 MHz base clock
8 GB VRAM
128% increase in floating point speed and a 13% increase in clock speed, for a 25% difference in cost. Unfortunately there is no Battlefield 4 comparison to make, but in BF1, the frame rate is doubled (30 fps to 60) and the textures are taken to 11. For what that looks like, I’ll leave it up to this bloke. Not to even mention that you can get the texture buffs in 4K. Just as with the lower-cost GPUs, the cheaper option here gives you the better performance per dollar. It’s even worse when you look at the CPU for a gaming PC. The more money you spend, again, the less of a benefit you get per dollar. Hardware Unboxed covers this in a video comparing different levels of Intel CPUs. One thing to note is that the highest i7 option (6700K) in this video was almost always within 10 FPS (though for a few games, 15 FPS) of a certain CPU in that list for just about all of the games. …That CPU was the lowest i3 option (6100). The lowest i3 was $117 and the highest i7 was $339, a 189% price difference for what was, on average, a 30% or less difference in frame rate. Even the lowest Pentium option (G4400, $63) was often able to keep up with the i7. The CPU and GPU are usually the most expensive and power-hungry parts of a build, which is why I focused on them (besides the fact that they’re the two most important parts of a gaming PC, outside of RAM). With both, this “pay more to get much more performance” idea is pretty much the inverse of the truth.
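The same arithmetic works for the CPUs just mentioned. A minimal check (Python) using the prices quoted in the text:

```python
def premium_pct(cheap, expensive):
    """Percentage price premium of the expensive part over the cheap one."""
    return (expensive - cheap) / cheap * 100

# USD prices quoted above.
i3_6100, i7_6700k, g4400 = 117, 339, 63

print(f"i7-6700K over i3-6100: +{premium_pct(i3_6100, i7_6700k):.0f}% price")
print(f"i7-6700K over G4400:   +{premium_pct(g4400, i7_6700k):.0f}% price")
```

That works out to a roughly 190% price premium over the i3 for an average frame-rate gain of 30% or less, and an over 400% premium versus the Pentium that often kept pace.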
“The console giants are bad for game developers, Steam doesn't treat developers as bad as Microsoft or especially Sony.”
Now, one thing you might’ve heard is that the PS3 was incredibly difficult for developers to make games for, which for some fueled the idea that console hardware is difficult to develop on compared to PC… but this ignores a very basic idea that we’ve already touched on: if the devs don’t want to make the game compatible with a system, they don’t have to. In fact, this is why Left 4 Dead and other Valve games aren’t on PS3: they didn’t want to work with its hardware, calling it “too complex.” This didn’t stop the game from selling well over 10 million units worldwide. If anything, this was a problem for the PS3, not the dev team. It also ignores that games like LittleBigPlanet, Grand Theft Auto IV, and Metal Gear Solid 4 all came out on PS3 in the same year as Left 4 Dead (2008). Apparently, plenty of other dev teams didn’t have much of a problem with the PS3’s hardware, or at the very least, they got used to it soon enough. On top of that, when developing the 8th-gen consoles, both Sony and Microsoft sought to use CPUs that were easier for developers to work with, which included design decisions that accounted for the consoles being used for more than gaming. Using single-chip proprietary CPUs is also cheaper and more energy-efficient than buying pre-made CPUs and boards, which is a far better reason for using them than some conspiracy about Sony and MS trying to make devs' lives harder. Now, console exclusives are apparently a point of contention: it’s often said that exclusives can cause developers to go bankrupt. However, exclusivity doesn’t have to be a bad thing for the developer. For example, when Media Molecule had to pitch their game to a publisher (Sony, coincidentally), they didn’t end up being tied into something detrimental to them. Their initial funding lasted for 6 months. From then on, Sony offered additional funding in exchange for console exclusivity.
This may sound concerning to some, but the game ended up going on to sell almost 6 million units worldwide and launched Media Molecule into the gaming limelight. Sony later bought the development studio, but 1: this was in 2010, two years after LittleBigPlanet’s release, and 2: Media Molecule seem pretty happy about it to this day. If anything, signing up with Sony was one of the best things they could’ve done, in their opinion. Does this sound like a company that has it out for developers? There are plenty of examples that people will use to put Valve in a good light, but even Sony is comparatively good to developers.
“There are more PC gamers.”
The total number of active PC gamers on Steam has surpassed 120 million, which is impressive, especially considering that this number is double 2013’s figure (65 million). But the number of monthly active users on Xbox Live and PSN? About 120 million (1, 2) total. EDIT: You could argue that this isn't an apples-to-apples comparison, sure, so if you want to compare the monthly number of Steam users to console instead? Steam has about half of what consoles do, at 67 million. Now, back to the 65 million total user figure for Steam: the best reference I could find for PlayStation's number was an article giving the number of registered PSN accounts in 2013, 150 million. In a similar 4-year period (2009 - 2013), the number of registered PSN accounts didn’t double, it sextupled. Considering how the PS4 is already at 2/3 of the number of sales the PS3 had, even though it’s currently 3 years younger than its predecessor, I’m sure this trend is at least generally consistent. For example, let’s look at DOOM 2016, an awesome fast-paced shooting title with graphics galore… Of course, on a single platform, it sold best on PC/Steam: 2.36 million Steam sales, 2.05 million PS4 sales, 1.01 million Xbox One sales. But keep in mind… when you add the console sales together, you get over 3 million sales on the 8th-gen systems. Meaning: this game sold best on console. In fact, the Steam sales have only recently surpassed the PS4 sales. By the way, VGChartz only shows sales for physical copies of games, so the PS4 and Xbox sales, when digital sales are included, are even higher than 3 million. This isn’t uncommon, by the way. Even with games where the PC sales are higher than either of the consoles, there generally are more console sales in total. But, to be fair, this isn’t anything new. PC gamers haven’t come to dominate the market; the proportions have always been about this much. 
PC can end up being the largest single platform for games, but consoles usually sell more copies total. EDIT: There were other examples but... Reddit has a 40,000-character limit.
This isn’t to say that there’s anything wrong with PC gaming, and this isn’t to exalt consoles. I’m not here to be the hipster defending the little guy, nor to put down someone or something out of spite. This is about showing that PCs and consoles are overall pretty similar, because there isn’t much dividing them, and that there isn’t anything wrong with being a console gamer. There isn’t some chasm separating consoles and PCs; at the end of the day, they’re both computers that are (generally) designed for gaming. This is about unity as gamers, to try to show that there shouldn’t be a massive divide just because of the computer system you game on. I want gamers to be in an environment where specs don't separate us; whether you got a $250 PS4 Slim or just built a $2,500 gaming PC, we’re here to game and should be able to have healthy interactions regardless of platform. I’m well aware that this isn’t going to fix… much, but it needs to be said: there isn’t a huge divide between PCs and consoles; they’re far more similar than people think. Each side has upsides and downsides the other doesn’t. There’s so much more I could touch on, like how you can use SSDs or 3.5-inch hard drives with both, or that even though PC part prices go down over time, so do console prices, but I just wanted to touch on the main points people use to needlessly separate the two kinds of systems (looking at you, PCMR) and correct them, to get the point across. I thank anyone who takes the time to read all of this, and especially anyone who doesn’t take what I say out of context. I also want to note that, again, this isn’t “anti-PC gamer.” If it were up to me, everyone would be a hybrid gamer. Cheers.
U.S. stock futures are marginally lower as investors await official domestic GDP data and another batch of corporate earnings to wrap up the week. Economists surveyed by The Wall Street Journal estimate GDP rose at a 2.5% annual rate for the quarter, which would mark the economy’s strongest January-March performance in four years. In Europe, the Stoxx Europe 600 was 0.1% lower in morning trade, with most sectors and major bourses in the red. In Asia, China’s Shanghai Composite fell 1.2% and Japan’s Nikkei closed 0.2% lower. Deutsche Bank profit surges but full-year revenue target cut Deutsche Bank (NYSE:DB) reported stronger-than-expected Q1 net profit of €201M ($223M), up 67% from a year ago. But revenues fell 9% for the quarter to €6.35B, the ninth straight quarter of revenue contraction, and the bank says it expects full-year revenue to come in flat after previously targeting a slight increase. Income from buying and selling securities fell 19%, marking the investment banking division's weakest Q1 since the financial crisis. The results follow Deutsche Bank's failed merger talks with Commerzbank (OTCPK:CRZBF, OTCPK:CRZBY), leaving Europe’s once-dominant financial institution to find other ways to boost profitability and revenue. Japan eyes possible GDP contraction as Q1 factory output declines Japan's industrial output fell in Q1 at the fastest pace in nearly five years, suggesting the economy may post a mild contraction in the quarter. Industrial output for January-March slipped 2.6%, the biggest decline since Q2 2014, according to data from the Ministry of Economy, Trade and Industry. The sharp decline shows the extent of the damage caused by the U.S.-China trade war, but economists are optimistic that Japan's economy can rebound quickly, since global growth remains on relatively firm footing. 
Chip stocks on watch after Intel slashes full-year guidance Intel (NASDAQ:INTC) plunged after-hours after cutting its financial guidance for the full year and reporting its first decline in sales of data center chips in seven years. The company said it now expects 2019 earnings of $4.35 per share on revenues of $69B, below its previous forecast for $4.60 earnings per share on $71.5B in revenues. “We're taking a more cautious view of the year, although we expect market conditions to improve,” CEO Bob Swan said, also taking note of the "challenging NAND pricing environment." On watch: Micron (NASDAQ:MU), Nvidia (NASDAQ:NVDA), Texas Instruments (NASDAQ:TXN), Advanced Micro Devices (NASDAQ:AMD). Amazon profit doubles; core Prime getting one-day shipping Amazon (NASDAQ:AMZN) reported Q1 results that beat earnings per share estimates by a hefty $2.44 and met revenue estimates with 17% Y/Y growth. But sluggish retail sales overseas and a flat performance from Whole Foods weighed down revenue growth for a fourth straight quarter. Amazon’s operating margin climbed to 7.4% in the quarter as expenses rose 12.6%, the lowest percentage gain in at least a decade. Expenses likely will move higher, in part because Amazon said it will invest $800M to make one-day free shipping the new standard for core Prime members. Musk, SEC again ask judge for more time to resolve contempt dispute Elon Musk and the SEC are seeking more time to work out their dispute over whether the Tesla (NASDAQ:TSLA) CEO violated a court order restricting his Twitter use. In a filing late Thursday, Musk and an SEC counsel asked a U.S. District Judge to extend their deadline for presenting the court with an "agreement in principle" until April 30. The judge had already granted a one-week extension to continue negotiations after she ordered the two sides to resolve the SEC's request to have Musk held in contempt of court. 
Facebook broke privacy laws by exposing user data, Canada says Canada's privacy commissioner said he will take Facebook (NASDAQ:FB) to court after finding the company’s lax practices allowed personal information to be used for political purposes. The report said it uncovered major shortcomings in Facebook’s procedures and that the company had rebuffed the commissioners' findings and recommendations. The Canadian probe comes as Ireland’s privacy regulator said it is investigating Facebook over a recent revelation that it had left hundreds of millions of user passwords exposed. Uber plans $44-$50 per share for IPO; PayPal to invest $500M Uber (UBER) reportedly is aiming for an IPO valuation of as much as $90B, seeking to price its shares in the $44-$50 per share range and hoping to raise $8B-$10B. Uber is expected to make the price range public in a filing Friday morning, which The Wall Street Journal reports will also include news of a roughly $500M investment in Uber by PayPal (NASDAQ:PYPL). PayPal already partners with Uber on processing its fares. Renault to propose merging with Nissan, reports say Renault (OTCPK:RNLSY) reportedly will propose to automaking partner Nissan (OTCPK:NSANY) that the companies merge under a new holding company where shareholders of each company would receive a roughly 50% stake. Renault is said to be moving quickly in proposing the holding company structure due to its concerns about Nissan's deteriorating business results. Renault's new Chairman, Jean-Dominique Senard, reportedly believes the move would allow the companies to move past disputes over the shareholding imbalance - Nissan owns a 15% stake in Renault, but Renault holds 43.4% of Nissan - and allow them to focus on Nissan's business recovery. 
Edison sued by Los Angeles County over wildfire damage Los Angeles County is suing Southern California Edison and parent company Edison International (NYSE:EIX) to recover $100M in costs and damages from a wildfire that may have been sparked by one of the utility's wires. It's the latest lawsuit against SoCal Edison since a fire last November that burned more than 150 square miles, destroyed 1,643 buildings and killed three people. The official cause of the fire remains under investigation, but Edison has said an electrical outage before the fire may have been caused by a guy wire and a jumper wire making contact. Goldman's 10 commodity trade ideas for the second half Goldman Sachs (NYSE:GS) has introduced 10 commodity trades for a second half it thinks will be based on shorter-focus considerations. Among metals trades Goldman is pushing: Long gold and short silver, long copper and short zinc, long palladium and short platinum, long 62% vs. 58% iron ore premium. In agriculture, Goldman is urging to go long the S&P GSCI Agriculture and Livestock Index. And in energy, it recommends being long cal20 WTI-Dubai; shorting Q3 2019 TTF (European gas) spread to Nymex (U.S. gas) prices; and being long the Q2 2020/Q2 2021 heating oil crack box spread vs. Brent. What else is happening... 3M (NYSE:MMM) knocks nearly 200 points off Dow following a dismal Q1. Citron’s Andrew Left takes no position on Tesla (TSLA). Berkshire’s (BRK.A, BRK.B) best use of cash may be buybacks. New York AG to probe Facebook’s (FB) contact collection. Thursday's Key Earnings Amazon (AMZN) +1% as profit doubles. Intel (INTC) -7.9% AH on 2019 guidance cut. Ford (NYSE:F) +7.7% PM after North America results shine. Starbucks (NASDAQ:SBUX) -0.5% PM on earnings beat. National Oilwell Varco (NYSE:NOV) -1.7% AH on Q1 earnings miss. Illumina (NASDAQ:ILMN) -4.4% AH despite Q1 beat, guidance raise. T-Mobile US (NASDAQ:TMUS) +1% AH on healthy Q1 subscriber additions, record financials. 
Cypress Semiconductor (NASDAQ:CY) +1.9% AH on beats, in-line guide. Juniper Networks (NYSE:JNPR) +0.5% after Q1 beat where sales, profits decline. Mattel (NASDAQ:MAT) +6.1% PM on slimmer loss. Seattle Genetics (NASDAQ:SGEN) -4.4% AH despite Q1 beat. GrubHub (NYSE:GRUB) +13.2% PM on Q1, diner metric beats. Proofpoint (NASDAQ:PFPT) -6.4% AH on downside FY profit outlook. Today's Markets In Asia, Japan -0.22%. Hong Kong +0.19%. China -1.20%. India +0.97%. In Europe, at midday, London -0.26%. Paris flat. Frankfurt +0.09%. Futures at 6:20, Dow -0.13%. S&P -0.09%. Nasdaq +0.11%. Crude -1.86% to $64. Gold +0.25% to $1,282.90. Bitcoin -1.5% to $5380. Ten-year Treasury Yield -1.2bps to 2.522%. Today's Economic Calendar 8:30 GDP Q1 10:00 Consumer Sentiment 1:00 PM Baker-Hughes Rig Count
Hey guys, I was just curious what you're doing with your net profit from mining. I'm planning to reinvest my earnings into more GPUs to fill up my current motherboard's PCI-E slots, so out of curiosity, I made this graph to see what would happen if I used my earnings only to buy more graphics cards for my rig. The graph is tailored to my setup: I have a GTX 1080 and a computer capable of supporting up to 5 more GPUs. I assumed that I could keep buying 1080s for $500 as soon as I racked up that amount of earnings, and that each card earned $10 a week after electricity where I'm from. After 6 GPUs, I factored in the price of building a new rig for another 6 cards. I also assumed that I would be able to sell the 1080s used for 50% of the price I bought them at, making them an asset that added value to my investment. Finally, I know there are simply too many variables to make a precise graph: the Bitcoin price may swing, GPU retail and used prices may swing, mining rates may go up or down over time, my electricity tariff may rise, the first cards could die by the 5-year mark, AMD and Nvidia could release newer cards that are more or less profitable than the 1080s, my house could be hit by a flood and I could lose everything, etc., etc... But assuming current rates hold, I just wanted to highlight the possibility of incredible compound gains by continuously re-investing earnings to expand one's mining rigs. https://drive.google.com/file/d/1YfGcv2BeJhCM0m5XHzwwG10cEg5oluKH/view?usp=sharing
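For anyone curious, the compounding effect described above is easy to sketch. This is a toy simulation (Python) under the same deliberately frozen assumptions stated in the post ($500 per card, $10/week profit per card, rig-building costs and resale value ignored); it is illustrative only, not a forecast:

```python
def simulate(weeks, card_price=500, weekly_profit=10, cards=1):
    """Buy another card whenever accumulated earnings cover one."""
    cash = 0
    for _ in range(weeks):
        cash += cards * weekly_profit
        while cash >= card_price:
            cash -= card_price
            cards += 1
    return cards, cash

for years in (1, 2, 3, 5):
    cards, cash = simulate(years * 52)
    print(f"Year {years}: {cards} cards, ${cash} toward the next one")
```

Growth is roughly exponential, since each new card shortens the wait for the next, but as the post itself notes, real-world variance in difficulty, prices, and hardware failures would dominate over these horizons.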
Huge surprise on upcoming Q2 earnings based on Rx480 580 sales?
AMD graphics card sales have been around 2.7-2.9 million units per quarter. With the recent surge in cryptomining, the RX 470/480/570/580 will continue to sell out for the remainder of the quarter. As you may remember, the last Bitcoin rush in 2013/14 resulted in over 5 million GPU shipments. The current sellout situation seems even more intense than 2013/14. Are we looking at the kind of Q3 '16 earnings jump that we saw with Nvidia? I remember those Kepler GPUs were selling out and commanding premiums like the RX 470/480/570/580 do now. Any thoughts?
[Discussion] My own personal guide to used hardware alternatives.
Hi there. My name is Jeff. I've been building systems for the better part of 15 years and try my best to contribute here actively. After being involved in this little community for a few years now, I'm noticing a serious lack of discussion about buying used components, and I feel like it's time to shed a little light on the subject for those looking to build on a (seriously) tight budget. As Linus said in his scrapyard wars video, buying new on $300 isn't practical, and if you posed the challenge to him on a random day, buying used is almost certainly the path he'd choose. As someone who's been "scrapyarding" as a hobby for the better part of 10 years, I figured I'd take some time to share some of what I've learned for the modern audience. Let's begin with a simple rundown of modern "budget" choices, and I'll tell you what I'd do instead. CPU The G3258 and Athlon 860k are the sub-$100 CPUs of choice, and both work just fine. I have built with both in the past, and each carries their own set of advantages. Used Alternatives: You can go in a couple of directions here; if you happen to have an LGA 1366 motherboard lying around, you can get an i7 920 or better for under $50, and they still hold up reasonably well. Being that LGA 1366 boards are not typically cheap when purchased used, my favourite option is the Phenom II x4 Black Edition series, each of which compare favourably to modern budget options, and will even overclock on some incredibly dated, dirt cheap AM2+ boards. In my experience, eBay prices on these get a little too high for my taste, but I've been able to nab several on Kijiji locally in Toronto for under $50 as well. GPU The R7 260x and GTX 750 ti are often cited as budget options for most builders, with the latter serving a very specific role in systems where power draw might be a concern. 
While there exists no option that can compete with the low power consumption of the 750 Ti (or even the single 6-pin connector goodness of the 260x), its performance can easily be matched (and exceeded) for less money. Used Alternatives: The Bitcoin mining craze from a few years back led to the Radeon 7950 and 7970 being blacklisted on the used market, and I think the fears about burned-out cards are a little overblown. Here in Toronto, you can easily grab a 7950 for the price of a 260x, but I don't pay anywhere near that for my builds. At most, a Windforce will cost me $125, whereas I recently picked up some non-boost-edition PowerColor versions for a mere $83 each (bought 3 for $250). EDIT: Forgot to mention something important - avoid the reference 7950 and 7970. They were employed to a far greater degree in mining rigs because of their rear-only exhaust, and if you see a bunch of them listed at once from the same seller, they're likely old mining cards. Only pick them up if they're incredibly cheap. Want to go even cheaper? The Radeon 6950 (with the shader unlock, preferably) or even the 6970 will rival the performance of the 260x, and shouldn't cost Canadians more than $50-$60. I personally have 2 in my possession right now, and have gone through at least a dozen in the last 6 months. In general, one should always avoid Nvidia when buying used, because their cards are far too popular and overvalued for their performance as they age. I still see GTX 660s selling for $150, which is absolutely absurd. Motherboards Motherboards on the used market are weird, and this can largely be attributed to the fact that they're hard to transport and don't handle well over time. As such, people don't really sell boards on their own that often, and you'll likely have more luck finding a combo of some kind (or even a ready-to-go tin-can with no graphics card) for less per part than you will finding a given board on its own. 
Used Alternatives: The boards I'd recommend depend entirely on the CPU you've chosen. Being that I'm a fan of the Phenom II x4 series, AM2+ boards are going to be dirt cheap, but DDR2 RAM is actually fucking expensive, so you'd likely be better off going with AM3. I've even seen some used AM3+ boards (the 970 ASRock Extreme3, in particular) for as low as $40, so it wouldn't hurt to look. On the Intel side, you're actually at a significant disadvantage. Much like Nvidia cards, Intel boards (and CPUs) retain their value and don't often come cheap. For me, LGA 1156 is the price/performance sweet spot, granted I can find an i7 8XX to go with it. Even still, they're going to run you a fair bit more than an AMD board, and likely aren't worth it by comparison. RAM RAM is RAM. DDR2 is pricey as fuck due to an obvious market shortage of the stuff, so the AM2+ board option might not be best by comparison. DDR3 RAM, however, is ubiquitous, and I always die a little inside when people building on a "budget" choose to buy new at all. If I'm being honest, I can get DDR3 RAM from e-waste recycling companies for as low as $10 per 4GB stick at 1333MHz, and not once have I had a bad stick of the stuff. Even for people going the route of the G3258 (which only supports 1333MHz), this is the clear winner. Is value RAM ugly as sin? Sure it is. Is it just as good as that fancy Ripjaws shit you've got in your current build? You betcha. Storage Hard drives are actually a tricky game, as they are the single most volatile component in any budget build, easily succumbing to wear and tear from age and daily use. As such (and some might find this hard to believe), I actively avoid HDDs when building value systems for people and opt for cheap SSDs instead. As always, check the date on a drive if you're really insistent on buying one, and considering how cheap a WD Blue is new, don't pull the trigger on a used one unless it's for less than $30/TB. 
SSDs are (akin to RAM) highly resilient and are nearly guaranteed to work when purchased used. The average SSD pulled from an old laptop or an off-lease office desktop will have no more than 100GB of writes on it, which leaves 99% of its life for you to exploit. While I have no specific brand recommendation, just be sure you're getting a relatively good drive with SATA III capability. 120/128GB variants of these sorts should cost you no more than $50 in my native Canada, and I've even gotten lucky on some larger sizes too. Recently I picked up 4 256GB Samsung 840 Pros for $75 each (I came), just days after I bought a Crucial MX100 of the same size for $85. Monitors Monitors are fun to buy, because recent shifts in display technology have rendered a lot of recent-but-obsolete models nearly valueless. For example, remember when 16:10 was a thing? I actually still like 1680x1050 monitors, but the rest of the world seems to disagree, so I've been able to pick up 23" variants for as little as $40. Being that the slightly lower resolution actually eases the strain on your VRAM a bit, it's a nice fit for a lot of budget cards that might not have a full 2GB available, like some variants of the 6950. 1600x900 monitors are often just as cheap and come with the same inherent benefit of being obsolete despite being almost as good as their bigger 1080p cousins. Keyboards and Mice If you're on a budget, we can't even have this discussion. As much as I like mechanical keyboards and high-precision gaming mice, people building used $300 systems aren't going to allot any of their budget to them. That said, wired USB keyboards and mice are virtually free (search your local Goodwill or Value Village for some), and if you have to pay money, buy a wireless combo for $20 new from some little shit store in a suburb somewhere. Cases Cases on their own sell for about half of their original retail price, give or take based on condition. 
I normally just get them as a part of a tin-can bundle and make use of them if they aren't too dirty, but when building for someone else, I'd often just prefer to buy a new budget case in the $40 range. PSUs I saved this topic for last, because it's by far the most difficult category to master. First off, you really need to do your research and understand how PSUs work before delving into these guys, as the cost associated is almost entirely dependent on how resilient the underlying platform has been proven to be. Generally speaking, reading reviews on JonnyGuru and HardOCP is a great start, but none of them account for units that are several years old. As a general rule of thumb, I use the EVGA 500W W1 as a reference point, and build my value tree around that. In other words, if a new EVGA 500W (a passable, proven budget unit) is cheaper than a used 500W variant of a better brand, why would I bother buying used? Sure, that 520W Seasonic S12II puts the EVGA to shame in terms of voltage regulation and ripple suppression, but can I really make the same claims of a unit that's 5 years into its life? Wouldn't I just be safer buying new? These are all factors you have to consider. For me, the threshold lies around 50% in terms of cost savings vs. risk. In other words, if you can find a used quality unit for less than half the price of the cheapest quality unit available at a given time, buy it. Anyhow I think that covers everything. And as a closing note, remember to be safe. Meet potential sellers (and buyers) in public, well-lit places, and try your best to avoid entering someone's home without some protections in place. Also, the more info you get about the person (address, phone number, etc) the less likely it is that a person will be trying to scam you. People who purposely conceal their identity do so for a reason. Also, feel free to ask me anything about my own experiences buying and selling used. 
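That 50% threshold for used PSUs can be written down directly. A tiny sketch (Python); `baseline_new` stands for the price of the cheapest proven new unit you'd otherwise buy (the EVGA 500W W1 reference point above), and the example prices are made up:

```python
def worth_buying_used(used_price, baseline_new, threshold=0.5):
    """Apply the 50% rule of thumb: a used quality PSU is only worth
    the added risk if it costs less than half the cheapest proven
    new unit available at the time."""
    return used_price < baseline_new * threshold

# Hypothetical prices: a proven new budget unit at $45 vs. two used listings.
print(worth_buying_used(20, 45))  # True  - savings beat the risk
print(worth_buying_used(35, 45))  # False - just buy new
```

The threshold is a judgment call, not a law; the point is that the used price has to discount heavily for the unknown wear on a years-old unit.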
I've been doing it as a hobby for a long, long time and have sold many value builds to people who can't otherwise afford PCs. I'm happy to impart any wisdom I might've gained over the years. Edit: CPU Coolers! Forgot those. Air coolers are a safe bet. They're slabs of copper and aluminum with fans strapped to them. Buy with confidence, and seek one out for $10-$15 if you plan to overclock. AIO water cooling is not so safe. Those things are typically only good for 2-3 years, and you have no idea how much longer a pump has before it gives. Budget builders likely aren't water-cooling anyhow, right? Edit 2: Just to be clear, when I said I'd been doing this for a long time, I should clarify that a) I once owned a game store and sold systems out of there and b) I currently resell systems out of my house to raise money for charity builds. I really don't want people to get the impression I'm trying to sell anything.
Nvidia Mining vs AMD Mining. Nvidia and AMD are the two major players in mining-rig hardware, and there is no straight answer as to which is better. There are areas where Nvidia cards are better, and areas where AMD hardware trumps Nvidia. Broadly speaking, AMD is the king of mid-range and budget mining rigs: AMD mining cards typically cost around two-thirds the price of their Nvidia counterparts, which makes them the more suitable choice for the novice miner on price alone. However, there's a tradeoff: Nvidia cards are almost always easier to use, configure, and overclock (i.e., squeeze more juice out of the card). Why Nvidia GPUs were so thoroughly outclassed by AMD products at Bitcoin mining is a question that has swirled around the community for the past few years, and it's worth digging into before deciding which mining GPU to buy.