1. This is an off-topic section of the forum.
    If you have an issue related to BeamNG, please post in the Troubleshooting section instead.

General computer talk/advice

Discussion in 'Computer Hardware' started by BlueScreen, Jan 25, 2015.

  1. vmlinuz


    Joined:
    Mar 2, 2014
    Messages:
    2,409
    deleted because posting after 3am is generally a bad idea
     
    #7921 vmlinuz, Jul 21, 2018
    Last edited: Jul 21, 2018
  2. SixSixSevenSeven


    Joined:
    Sep 13, 2013
    Messages:
    6,958
Yes. Zero documented exploitations of it. And it really isn't as big a back door as you think: it's tiny, and it already requires you to have achieved code execution through another vulnerability. All it does is let you break out of your security sandbox afterwards, through a means so sketchy that nobody has even documented it occurring in controlled conditions while trying to force it to happen.
     
    • Agree Agree x 2
    • Like Like x 1
  3. aljowen


    Joined:
    Oct 21, 2012
    Messages:
    1,677
    Nothing is secure, be that in computers or otherwise.

    Given enough time, effort and skill, any security can be compromised. Locks can be broken, UEFIs can be infected, etc. However, these complex forms of attack make no sense to use unless a target is of very high value. Most run-of-the-mill stuff is phishing attacks: why go to all the effort of targeting one random person with some super-hardcore exploit when you can more or less ask a certain percentage of the population for access to their computers over the phone?

    Exploiting systems is very much a financial or political venture. Target as many people as possible*, with the least effort possible; or target a single entity, which could have a huge payoff.

    And even when targeting a large organisation, their security is usually good enough that it's easier to use phishing to gain access than to exploit the systems themselves, since low-level employees who aren't computer specialists are far easier to trick than a complex security system.


    *Preferably vulnerable people, since they are less likely to know how to solve their issues and less likely to try to track the exploiter down. This is thought to be why many scams look obvious: they are intended to self-select victims who have no idea what they are doing.
     
    • Agree Agree x 1
  4. vmlinuz


    Joined:
    Mar 2, 2014
    Messages:
    2,409
    Update: got some sleep, headache went away, read my post again, it all looks very silly now. The way I had described it made it sound less like "highly theoretical potential issue with performance-oriented processors and already-broken software," and more like "P=NP discovered." Disregard... although I'll still be sticking with AMD where security counts - just found out that the Ryzen 3 will likely not be "vulnerable" to speculative execution attacks.

    Wait, hold up: I believe the rest of your post, but what about the proof-of-concepts? Those work more or less reliably (although they are essentially useless)...
     
    #7924 vmlinuz, Jul 21, 2018
    Last edited: Jul 21, 2018
    • Informative Informative x 1
  5. SixSixSevenSeven


    Joined:
    Sep 13, 2013
    Messages:
    6,958
Proof of concepts don't even work reliably. It relies on being able to predict the order of the out-of-order execution, which is incredibly difficult, and then also on inserting yourself into that rearranged ordering without altering it. You need your instructions to execute while another instruction running in kernel-privileged mode is stalled in order to trigger the hack - something the out-of-order execution does try to prevent. The bug is that in some circumstances it can hypothetically occur, but it's near enough impossible to actually trigger.
     
    • Like Like x 1
  6. fufsgfen


    Joined:
    Jan 10, 2017
    Messages:
    6,781
    I'm thinking of finally upgrading my 1050 Ti to something faster.

    1060 6GB I could get for 300-340 (from what I have read, that would take me for example from 40fps to 60fps). That would be a really expensive upgrade.
    1070 8GB would be 450-500 (don't know how much faster).
    1080 8GB would be from 520 to insanity (I guess it would be fast enough for whatever, and for at least 4 years into the future).

    The last two would be like selling an arm and a leg, and eating only lean soup for the rest of the year.

    The 1080 that costs 520 is a Zotac, which makes me nervous, but by the numbers I guess it gives the best performance for the money. However, a 1070 Ti from a better brand would be around the same price - is there really a huge difference between them?

    Then there is the PSU argument: a 1060 would be no trouble, but I'm not so sure about the faster ones. They claim 500W, which I'm very slightly shy of, and my PSU does not have a fan, so it is not the best PSU to tax to the max, even if it is one of the best quality-wise.
     
  7. SixSixSevenSeven


    Joined:
    Sep 13, 2013
    Messages:
    6,958
My 1070 + 2600K playing Destiny 2 clocked a 330W draw from the wall.
     
    • Informative Informative x 2
  8. fufsgfen


    Joined:
    Jan 10, 2017
    Messages:
    6,781
That is not bad at all; with a 6700 it would probably be even less. Thx for the info!
     
  9. SixSixSevenSeven


    Joined:
    Sep 13, 2013
    Messages:
    6,958
Just note, that's very specifically in Destiny; it may vary by game and will vary with resolution. I'm still at 1080p60.
     
  10. aljowen


    Joined:
    Oct 21, 2012
    Messages:
    1,677
    I am running a 600W PSU with an overclocked 2500K, a GTX 970 & GTX 560 Ti, plus two SSDs, two HDDs, and 12GB of RAM.
    So with a 500W PSU you ought to be fine (provided it meets its rating - if it's very cheap, it could be a 300W PSU with a 500W sticker on it, etc.).

    If in doubt, Enermax has a PSU calculator that is pretty useful:
    https://www.enermax.outervision.com/
     
    • Informative Informative x 1
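Roughly, what such a calculator does is sum estimated peak draws per component and add headroom. A minimal Python sketch of that idea follows; all the wattage figures and the 30% margin are illustrative assumptions, not data from the Enermax tool.

```python
# Back-of-envelope PSU sizing: sum assumed peak draws and add headroom.
# All component wattages here are rough illustrative guesses.

def recommend_psu(components, headroom=0.3):
    """Return (estimated peak draw, recommended PSU wattage)."""
    peak = sum(components.values())
    return peak, peak * (1 + headroom)

# Hypothetical build loosely matching the one discussed in the thread
build = {
    "CPU (i7 6700, 65W TDP)": 65,
    "GPU (GTX 1080, ~180W board power)": 180,
    "motherboard + RAM": 50,
    "drives + fans": 25,
}

peak, recommended = recommend_psu(build)
print(f"Estimated peak draw: {peak} W")          # Estimated peak draw: 320 W
print(f"Recommended PSU: {recommended:.0f} W")   # Recommended PSU: 416 W
```

The headroom matters because PSUs are most efficient well below their rated load, and a fanless unit in particular benefits from not running near its limit.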
  11. fufsgfen


    Joined:
    Jan 10, 2017
    Messages:
    6,781
    My PSU is one of the best in its capacity class, I believe, but it is a bit under 500W:
    https://www.techpowerup.com/reviews/Seasonic/SS-460FL_V2/

    Here are test results for my PSU, including a 500W load where it still holds up better than some PSUs do at their rated capacity. I would not like to overtax the PSU though; after all, I'm attempting to get 10+ years out of the system:
    https://www.techpowerup.com/reviews/Seasonic/SS-460FL_V2/5.html

    I guess that the recommended 500W for those GPUs is to cater for worst-case scenarios, with an AMD CPU? I'm not even a fan of overclocking, so unless there is an issue with the rails, this might have enough power to do the job.

    Out of curiosity, I tested how much the i7 6700 and 1050 Ti use when playing Beam, and I did not manage to get higher than 136W.

    A 1080 would be 173W, so that would be roughly a 100W increase over the 1050 Ti. So I guess I don't need to worry about the power aspect, even though a faster GPU probably increases the power draw of the CPU/system too - it can't be a lot, probably no more than 20W.
    https://www.tomshardware.com/reviews/nvidia-geforce-gtx-1080-pascal,4572-10.html
    https://www.tomshardware.com/reviews/nvidia-geforce-gtx-1050-ti,4787-6.html
    https://www.tomshardware.com/reviews/skylake-intel-core-i7-6700k-core-i5-6600k,4252-11.html (non-K variants are only 65W max, if I recall correctly)

    So the 500W recommendation is probably for AMD users with huge numbers of cores, and maybe also to cater for cheap, under-performing PSUs.

    4K readiness is then one point to think about, I guess. 60" 4K smart TV screens are getting affordable, especially second-hand, but I wonder if any GPU runs Beam perfectly well at 4K with the recent content additions - that is quite a lot to ask from a GPU.
     
  12. aljowen


    Joined:
    Oct 21, 2012
    Messages:
    1,677
    Presumably it's not a good look for Nvidia if users start complaining that their $5 350W PSUs are combusting.

    Seasonic is a very good high-end brand of PSU; yours is probably market-leading in terms of efficiency, since it doesn't need a fan (it might even be a higher-wattage PSU under-run to reach fanless temperatures [with appropriately lowered cutoffs for safety, of course]).

    So that PSU should handle any single gaming GPU you can throw at it, with any gaming-grade CPU.


    I *think* BeamNG *should* be runnable at 4K with *potentially* even a GTX 1070, as long as you adjust graphics settings (and expectations) appropriately. And in any case, consumer 4K is a direct multiple of 1080p, meaning 1080p content will look lovely on a 4K screen regardless.
     
    #7932 aljowen, Jul 22, 2018
    Last edited: Jul 22, 2018
    • Like Like x 1
    • Agree Agree x 1
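The "direct multiple" point above can be made concrete: 3840x2160 is exactly twice 1920x1080 in each axis, so every 1080p pixel maps onto a clean 2x2 block of 4K pixels with no blurry fractional scaling. A toy nearest-neighbour upscale in Python (the tiny 2x2 "image" is just for illustration):

```python
# Toy nearest-neighbour 2x upscale: each source pixel becomes a 2x2
# block, which is exactly how 1080p maps onto a consumer 4K panel.

def upscale_2x(image):
    """Duplicate every pixel into a 2x2 block (nearest neighbour)."""
    out = []
    for row in image:
        doubled = [px for px in row for _ in range(2)]  # double columns
        out.append(doubled)
        out.append(list(doubled))                       # double rows
    return out

assert 3840 / 1920 == 2160 / 1080 == 2.0  # exact integer ratio

small = [[1, 2],
         [3, 4]]
print(upscale_2x(small))
# -> [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

With a non-integer ratio (say 1080p on a 1440p panel) pixels would have to be interpolated instead, which is why that combination tends to look softer.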
  13. fufsgfen


    Joined:
    Jan 10, 2017
    Messages:
    6,781
    I wonder how that single-core usage scales with such a resolution hike, though, because that is the limitation I face in Beam that I can't work around by adjusting graphics settings and resolution. Generally that single-core usage comes from the number of objects, but it is also related to the colmesh to some extent; shadows used to cause a lot of single-core load on maps with many objects, or at least the shadow and shader graphics settings did help with that load.

    What I don't know is whether more stuff on screen will make it much worse, and how the LOD handling will deal with that load. I can only presume the single-core load goes up a bit too and might become the limiting factor, but then again the devs might eventually do something about that, especially if they move to the new T3D version with PBR - though that probably will not happen within 2 years (completely random guess).

    Still, my dream of a decent-sized 4K screen might take those 2 years to realize, but it would be silly to upgrade to something only a little faster and then need to upgrade again later. Since the current system manages for now, it would make more sense to upgrade to a GPU that has the grunt for future computing too.

    That is why I am considering a 1080, but I could only get one of the lowly brands - KFA2, Zotac, Palit, Inno3D, etc. - and I'm a bit worried about coil whine on those.

    As nothing is simple, there is also Volta coming, which might bring the prices of the 1070 Ti and 1080 down a bit, or at least bring life to the second-hand offerings available to me; right now second-hand is only 10% cheaper than new, which is not worth the risks involved.

    So it's a 1-2 month wait, or buy now. I guess waiting would not hurt here?
     
  14. aljowen


    Joined:
    Oct 21, 2012
    Messages:
    1,677
    Lighting is done within the shader that runs on the GPU, so it shouldn't normally affect CPU usage within a game engine. Perhaps BeamNG is weird in that regard? I haven't tested.

    If it is causing increased CPU usage, perhaps it is because the GPU is struggling to keep up and the CPU has to wait longer to send the buffers over to the GPU? Or some similar issue?

    Collision meshes and visible meshes can affect CPU performance though, since it is the CPU that calculates their translations/rotations and physics. LODs are also calculated on the CPU, so they could cause increased load with higher numbers of objects (perhaps offset by having fewer polygons to transform, leading to a net benefit?), but that shouldn't change with resolution.
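The CPU-side LOD work described above is, at its core, a per-object distance check against the camera. A minimal sketch of the idea in Python; the thresholds, level names, and `pick_lod` helper are made up for illustration and are not BeamNG's or Torque3D's actual API:

```python
# Minimal sketch of CPU-side LOD selection: choose a mesh detail level
# from camera distance. Thresholds and level names are hypothetical;
# real engines add hysteresis so objects don't flicker between levels.
import math

# (max_distance, level) pairs, nearest first - illustrative values
LOD_TABLE = [(50.0, "high"), (150.0, "medium"), (400.0, "low")]

def pick_lod(camera_pos, object_pos):
    dist = math.dist(camera_pos, object_pos)  # Euclidean distance
    for max_dist, level in LOD_TABLE:
        if dist <= max_dist:
            return level
    return "culled"  # beyond the last threshold: skip drawing entirely

camera = (0.0, 0.0, 0.0)
print(pick_lod(camera, (10.0, 0.0, 0.0)))   # -> high
print(pick_lod(camera, (0.0, 300.0, 0.0)))  # -> low
```

Since this check runs once per object per frame regardless of resolution, it illustrates why object count (not pixel count) drives that part of the CPU load.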
     
  15. fufsgfen


    Joined:
    Jan 10, 2017
    Messages:
    6,781
    To me it looks like there is some per-object overhead on the graphics thread, as it is the CPU core doing some of the graphics processing that gets overloaded, and it is very much related to how many objects you have. Meshroad objects especially use CPU for reasons I don't fully understand; turning shadows off makes that load go away though, so it can't all be colmeshes, as shadows should not affect those.

    It is a very interesting curiosity that I have not quite figured out, and I have not tested the current version or the previous one, but I have noticed they use more GPU than before.

    What I can't do currently is doughnuts: smoke makes my FPS drop to the 20-30s if I try AWD doughnuts. Some smoke is okay, no problem, but really burying the car in smoke kills my FPS totally. Not sure what would happen with a 1070 or faster, but I guess that too is an area they will optimize at some point.

    upload_2018-7-22_21-57-35.png

    Bolide burnouts are another thing that is hard to render, but doesn't that smoke look nice now - quite far from the smoke there was at 0.8!
     
  16. Bubbleawsome


    Joined:
    Aug 5, 2012
    Messages:
    1,887
Yeah, smoke in Beam murders the 1050 in my laptop, but the 1070 in my desktop doesn't struggle at all. I'm pretty sure I'm not dropping under 75fps at all, but I'll go and make sure. Watch this space.
     
  17. aljowen


    Joined:
    Oct 21, 2012
    Messages:
    1,677
I haven't noticed an issue with smoke on my 970. But that might be because it usually runs the game above 60 fps, so when smoke happens I probably wouldn't notice any fps drop.
     
    • Informative Informative x 1
  18. Bubbleawsome


    Joined:
    Aug 5, 2012
    Messages:
    1,887
Hm, I'm getting lots of drops under 60 right now on Grid Map. But smoke doesn't affect them, so I guess it's not that.
     
    • Informative Informative x 1
  19. fufsgfen


    Joined:
    Jan 10, 2017
    Messages:
    6,781
    Normally smoke does not cause a slowdown for me either, but when doing a burnout against the brake, or anywhere that really lots of smoke gets generated - like doughnuts with the Bolide or a really strong-motored AWD - it goes over some sort of tolerance and all of a sudden drops really low, while with just a tiny bit less smoke it stays at 60fps.

    I'm thinking it might be some GPU bandwidth issue, and thus that a 1070 and the like might not suffer from it. Thanks for letting me know your experiences; it all helps me understand a bit better how Beam works and what upgrade levels to consider for the best gains.
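One plausible mechanism behind a smoke "cliff" like this is overdraw: alpha-blended smoke sprites each shade every screen pixel they cover, so many stacked layers multiply the pixel work (and memory bandwidth) per frame. A back-of-envelope Python sketch; the coverage and layer-count figures are purely illustrative guesses, not measurements of BeamNG:

```python
# Back-of-envelope overdraw estimate for alpha-blended smoke sprites.
# Each overlapping transparent layer re-shades the pixels it covers,
# so dense smoke multiplies per-frame pixel work. Numbers are guesses.

def pixels_shaded(width, height, coverage, layers):
    """Pixels shaded per frame by `layers` overlapping sprites that
    each cover `coverage` (0..1) of a width x height screen."""
    return round(width * height * coverage * layers)

base = 1920 * 1080                                  # one opaque full-screen pass
light_smoke = pixels_shaded(1920, 1080, 0.3, 5)     # a few wisps
heavy_smoke = pixels_shaded(1920, 1080, 0.8, 40)    # buried in smoke

print(f"light smoke: ~{light_smoke / base:.1f}x a full-screen pass")  # ~1.5x
print(f"heavy smoke: ~{heavy_smoke / base:.1f}x a full-screen pass")  # ~32.0x
```

The jump from a handful of layers to dozens is roughly where a mid-range GPU's fill rate or bandwidth could stop keeping up, which would fit the sudden rather than gradual FPS drop described above.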
     
  20. redrobin


    Joined:
    Aug 21, 2012
    Messages:
    606
    Hi, GTX 1070 owner here. While I'm not at 4K, I'm at 5760x1080, which is essentially 3K - pretty much 75% of the pixels of 4K, just in a funky racing-sim-friendly resolution. My experience with Beam has been this:

    The menus chug like a bastard, but I think that's just a side effect of my funky resolution and the fact that I run a little over 200 mods. Overall gameplay is pretty smooth, depending on what map is selected - a mix of ultra and high settings with a normal or two mixed in, with reflective textures at the highest resolution. Large maps like Desert Highway, or busy maps like Insane Testing, West Coast USA, etc., do like to drop frames when looking at high-poly portions of the map. It's all very playable, and a 1070 would do 4K Beam just fine.
     