
Trying to increase the number of bots, need some help from experienced botters

Has anyone here tried a "cloud gaming" config? I mean with a badass NVIDIA Tesla in an HPC cloud computing setup?
 
Yes, it can... You can't use RDP to connect to the desktop while running D3; RDP does not support 32-bit color mode and D3 will error out. I use TeamViewer to connect to the VMs.

Warbux, I am using RDP to connect to my PCs while D3 is running. I run D3 at 800x600 and set the remote client resolution to 1024x768. I had to set d3prefs.txt to read-only, as it kept trying to go fullscreen when the remote desktop client connected, which results in an error.

Also, instead of closing the RDP session like you would close a regular window, I use a batch file which does not lock the console when disconnecting. This is also useful for anyone using a PC as a media server; VNC/TeamViewer/LogMeIn etc. are all slow as shit. On LAN with all the goodies cranked up through Remote Desktop Connection, I can view my bots remotely in full color and at least 30 fps; it's like being in front of the actual PC. Make a batch file and put this in it for disconnecting:

@echo off
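REM Hand this RDP session back to the physical console so the machine is not left at the lock screen when the client disconnects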
%windir%\System32\tscon.exe %sessionname% /dest:console
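
If you want to script the read-only flag on D3Prefs.txt as well (assuming the usual Documents location; adjust the path if yours differs):

attrib +R "%USERPROFILE%\Documents\Diablo III\D3Prefs.txt"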
 

Did you check the security settings to see if you could keep it from locking there?

I will look into the d3prefs.txt fix. I was unable to start D3 while in a remote desktop session, but perhaps that will fix that issue as well.
 
ECC RAM has come down in price. I would look at the E5620; it is a great processor for what you're doing, has 12M cache, and is built for VMs. You can start with 1 processor and expand as needed, but I would start with 2 and just expand RAM as needed. Your limiting factors are going to be GPU and HDD IOPS. You can combat this by adding more GPUs and buying a SCSI card (battery backup preferred). Instead of going SSD, just throw in 4 HDDs in a RAID10 config.

One of my servers will run 30-60 bots (obviously I would need to add more GPUs to hit this number)
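
As a rough back-of-the-envelope on that RAID10 suggestion, assuming a ballpark of ~150 IOPS per 7,200 RPM SATA drive (an assumption, not a measured figure): with 4 drives in RAID10, reads can be serviced by all four spindles for roughly 4 x 150 = ~600 read IOPS, while writes hit both halves of each mirror for roughly 2 x 150 = ~300 write IOPS. That is the gap a battery-backed controller cache or SSD cache is meant to cover.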
 
What about this one? I read it's the successor to the E5620, roughly the same price, 6 (12) cores, but a slower clock speed. Is a high clock speed important in this kind of rig?
Intel Xeon E5-2620
Newegg.com - Intel Xeon E5-2620 Sandy Bridge-EP 2.0GHz (2.5GHz Turbo Boost) 15MB L3 Cache LGA 2011 95W Six-Core Server Processor BX80621E52620

Why is SSD a bad choice? Most SSDs these days do 100,000 IOPS, which should be way more than enough for anything you can throw at them. Do you mean that most server mobos have bad controllers? Or is 100,000 IOPS actually not enough for running this many bots?
I read somewhere that 1 VM performs 1,000 IOPS?

Also, is internet speed a factor here? I'm currently using FiOS 25/25 Mbps, and I'm wondering if I'll notice any difference if I upgrade to 150/35 Mbps.

Lastly, I was looking at this chart: Windows Server 2008 R2 - Wikipedia, the free encyclopedia
Which Windows Server 2008 R2 edition do I need?

If the CPU is $400 and the mobo is another $400-500... this is starting to seem like an expensive project. Is this really the best bang for my buck?
 
You get a barebone Supermicro server. It has everything already built in: PSU and mobo. Then add your HDDs, processors, RAM, video card, etc. With Hyper-V you can't use linked clones, so HDDs are the way to go. You can use an SSD for cache. That processor would be better than the one I linked. The problem with onboard controllers is that they are onboard, just like onboard video cards. A nice SCSI controller is the way to go, but you could start off using an onboard one to save money. Your internet connection is fine.
 
Doesn't the barebone server alone cost $800-1000? Would it be cheaper to build it all from scratch?
Also, regarding GPUs: could you assign each GPU to certain VMs, or by adding additional GPUs do you mean configuring them in CrossFire/SLI?

So, fully upgraded to 2 CPUs and 50+ GB of RAM, I could run 50 bots on this system? Or is this spec only good for 20 bots?

Appreciate all the help so far!
 
Read the first page and skipped to the end, so sorry if someone touched on this, but if I had to guess, you are running out of VRAM on your video card.

Additionally, you would want to avoid SLI and CrossFire, as they tend to share a lot of resources and cost you efficiency. Running the cards separately and dedicating one to each display will allow you to pick which clients run on which GPU, or you can find a card that emphasizes memory capacity, as you won't need much of their actual processing power.
 

What do you mean by dedicating one to each display? I am running 10 bots on one monitor, and one GPU.
 

That was regarding multiple GPUs, as that seemed to be where the conversation was going :P

If a GPU is dedicated to a display and a DirectX application is initialized on that display, that GPU will be responsible for it until the application is terminated. Such tactics come in handy for distributing the load of clients across your cards. :D

However, as a quick test to see if it is the problem, download GPU-Z and see what your VRAM is at!
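
If GPU-Z isn't handy, nvidia-smi (it ships with the NVIDIA driver on both Windows and Linux) can report the same numbers, though memory queries may show N/A on some older GeForce cards:

nvidia-smi --query-gpu=memory.used,memory.total --format=csv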
 
By display, are you referring to a monitor or each VM guest? Because I am never going to use more than 1 monitor for botting, so I don't think that will be an issue.


Warbux, I read that RemoteFX has a 12-VM-per-GPU limitation? How do you get around this if you want to run 50 bots? A workstation card? Those are not cheap, though I could be wrong, as I've never researched them much.
 
Warbux, someone told me this, so I am not sure how a workstation would be economically feasible for botting now:

RemoteFX can only leverage 1 GPU. The number of accelerated RDP sessions is based on the VRAM (video RAM).

You cannot pass the GPU through to the VMs in Hyper-V. Also, VMware ESXi only supports passing through 2 PCIe devices, and even this is hit or miss depending on the motherboard used and the actual connection of the PCIe lanes.

Which means my only choice is to get a really expensive workstation GPU. The person also mentioned ESXi. Would VMware ESXi be better than Workstation at virtualization?

There's no way to build this for under $1,500. The GPU will be a couple hundred or a couple thousand, the barebones server is $850, the CPU another $400, RAM another $400-800, the RAID controller will cost $400, and the hard drives will cost a few hundred as well.

Getting it to run only 20 bots will cost quite a bit... then again, getting a desktop to run 20 bots will cost quite a bit as well, since you cannot create linked clones in Hyper-V =\
 
You can buy a used one and add a good GPU from there. Hell, I have one I will sell you pre-built. RemoteFX can use more than 1 GPU; the only requirement is that all GPUs must be identical.

Hardware Considerations for RemoteFX


I don't know if that 12-sessions-per-GPU limit is accurate or not. I am using two 460s, and each 460 has 2 GPUs. I have never actually tried to push it past 40 concurrent sessions to test that 12-session limit.
 
Hmm, are you in California by any chance? Yeah, I always buy used GPUs and anything else I can get used. CPUs and mobos are hard to find used though, even more so for servers, at least on Craigslist.

Would VMware ESXi be better than Workstation at virtualization? I am trying to find a better solution for my current computers, an i7-2600K build and a 3770K. They only have 120 GB SSDs, so Windows 2008 R2 with Hyper-V wouldn't work.


I was looking at this barebones server; I don't think it can be built cheaper from individual parts, do you?

Newegg.com - SUPERMICRO SYS-7037A-I Mid-Tower Server Barebone Dual LGA 2011 Intel C602 DDR3 1600/1333/1066/800
 
Don't even try to use ESXi; the only D3D support is through CPU emulation or VGA passthrough, where you need 1 GPU for each VM. Your best bet is still Workstation. Maybe in the future, with NVIDIA VGX boards, we can use Xen or ESXi or some other bare-metal hypervisor.
 
ESXi is out of the question. Your only real options are Workstation/Player or Hyper-V... Supermicro barebone servers are the way to go. I have 5 of them here, 4 rackmounts and a tower. I want to get rid of the tower; I have it on Craigslist, but that really is specialized kit. I thought about putting it on eBay (I have a lot of positive sales there), but I just haven't been motivated enough to bother. I can always find another use for a good server; I run a pretty large network here at home: a Cisco 1921 SEC/K9 router, 2 layer-3 switches (Cisco and ProCurve), plus god knows how many workstations.
 
If a RAID controller + drives will cost me $800, wouldn't it cost about the same to get an SSD or two? SATA 3 drives can be found for $1/GB or less.
 
My specs:
AMD X6, 6 x 2.8 GHz
4 x 4 GB RAM, 16 GB total
AMD HD 6950
2 TB WD Green
9 bots for me
CPU load ~95% at peak
GPU load ~50%
 
If a RAID controller + drives will cost me $800, wouldn't it cost about the same to get an SSD or two? SATA 3 drives can be found for $1/GB or less.

The fact of the matter is that onboard RAID controllers (on any system) will never fully handle the capabilities of an SSD. You will always need a hardware RAID controller if you want to truly get the most out of your drives in a RAID config. Anyone who tells you otherwise is smoking stuff that is illegal in all 50 states.
 
I see... but with SSDs, why do I need to run them in a RAID config in the first place? Just one SSD by itself is many times faster than any HDD in a RAID configuration.

Anyway, to even consider getting another computer, I want to max out my current setup first. Like someone else suggested, I'm wondering if increasing VRAM would help here?
There must be something about my GPU I could change to make it run better. Switching to Windows Server 2008 R2 for my desktop computer will also cost some money, because I only have 128 GB of HD space and each VM needs something like 12-15 GB.
So I guess the question is: will I see more benefit by upgrading the GPU to the 2 GB VRAM version, getting a more powerful GPU altogether, or adding more hard drive space and switching to Windows 2008?
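
Quick arithmetic on the disk side, using the numbers above: at 12-15 GB per VM, a 120-128 GB drive holds roughly 8-10 VM images at the absolute most, and that is before the host OS and page file take their share.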


To recap my specs again:

i7-3770K (clocked to 4.0 GHz)
Noctua NH-D14 heatsink
NVIDIA GTX 460 1 GB
32 GB G.Skill RAM, 1333 MHz
120 GB OCZ Vertex 4 drive
650W Rosewill PSU
Linux Mint 13 x64 (VMware Workstation 8.0.3)

With these specs, most people can run 15-20 bots without VMware, using multiboxing software such as ISBoxer.

I, however, with VMware, can only get 8 to run (one bot per VM)... if I try to run just 1 more, my FPS lags and GPH drops significantly across all bots.

Anyone know how much VRAM D3 uses at 800x600 on the lowest settings?
 