Do powers recharge during their own arcanatimes?


 

Posted

Quote:
Originally Posted by Savos View Post
Arcanaville, do you have all your tests on perhaps a Google docs spreadsheet that is readable by others?

I'd like to see what data you generated via your tests if possible.

EDIT: Or rather what is the variance of values for your measured cycle duration for each power? That would be far more useful and something you probably already have.
I don't, but I do have it in spreadsheet form. It's all quite a quick and dirty hack, but I'll see what I can do. The way I do the measurements is very accurate: I use demorecord timing data, which I've tested carefully to determine that those timestamps come from the server, not the client (when I demorecord the same sequence from two different observers on two different computers in two different locations - don't ask - I get the same timestamps, plus or minus, infrequently, one millisecond, which is probably a roundoff thing - given the differences in lag and such, that seems conclusive). I do it very late at night on the test server, in instances without a lot of stuff happening. Timing variations for unaligned events (i.e. animation events) are usually under 100ms.

For example, I just did a sort of my timing for Spin cycles (yeah, I picked that power just so I could say that). The absolute lowest measurement was 11896ms and the highest was 12052ms, giving a range of 156ms. But those two were actually weird outliers, possibly hiccoughs on the server. Throwing just those two measurements out, the remaining 2571 measurements (see, wasn't kidding: 8 hours 33 minutes) all fell between 11913ms and 12022ms, a range of 109ms. 2563 of the 2573 measurements fell between 11921 and 12021 (100ms). Calculated average: 11973.47ms, almost in the middle of the two extremes (minus the two outliers).

What's interesting is that when you histogram the results, you don't end up with a single maximum. You end up with three gaussian-looking distributions centered around approximately 11944ms, 11972ms, and 12011ms. That suggests there are reasons why the server is preferentially generating recharge cycles of approximately those beat intervals, which is on my list of things to investigate. Those gaussians are a bit rough though, even with 2500 measurements, so I might need an even longer measurement run to refine the data. Maybe 48 hours if I can get it.
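
For anyone who wants to replay that kind of analysis on their own measurements, here's a rough sketch of how you might bin cycle durations into 1ms bins and pick out the local maxima. This is plain Python with a hypothetical input file, not the actual tooling I use, so treat it as illustrative only.

Code:
from collections import Counter

def load_durations(path):
    # Hypothetical input: one measured cycle duration in milliseconds per line.
    with open(path) as f:
        return [int(round(float(line))) for line in f if line.strip()]

def local_peaks(hist, min_count=30, window=5):
    """Return bins whose count is the largest within +/- window ms."""
    peaks = []
    for ms, count in hist.items():
        if count < min_count:
            continue
        neighborhood = [hist.get(ms + d, 0) for d in range(-window, window + 1)]
        if count == max(neighborhood):
            peaks.append((ms, count))
    return sorted(peaks)

data = load_durations("spin_cycles_ms.txt")     # hypothetical file name
hist = Counter(data)                            # 1ms bins
print("samples:", len(data), "min:", min(data), "max:", max(data),
      "mean:", round(sum(data) / len(data), 2))
for ms, count in local_peaks(hist):
    print("peak near", ms, "ms (count", count, ")")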

I'll see if I can put the data together into a form that I can post to google docs, although that might have to wait until I complete my next round of tests. The actual raw data might get ... lengthy after a while though. I had about sixty thousand measurements when the dust settled on ArcanaTime (and I'm up above twelve million random rolls on my "prove the rand() is really random" project).


[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]

In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)

 

Posted

Well, that just about answers it once some details fill in. The recharge time appears to be based on server frame ticks, judging by your three peaks separated by ~33 milliseconds.

Based on this, and figuring that there are 4 spots where extra frames can be inserted to allow resynchronization with combat ticks, the differences between theory and actual measurements are potentially explained.

4 spots as I see them:
- Beginning of power activation
- End of power activation
- Beginning of recharge
- End of recharge.

The span between the end of power activation and the start of recharge seems the likeliest place where extra frames get dumped, though. And depending on where you are in the frame count versus the combat tick, you should presumably get 4 nice peaks, since there are 3.75 server frames per combat tick - though your data suggests only 3, because of rounding.
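
To make that concrete, here's a quick sketch of the unit conversion, assuming the 30 frames/sec server clock and 0.125s combat tick being discussed in this thread (Python, illustrative only):

Code:
FRAME_MS = 1000.0 / 30.0               # ~33.33 ms per server/animation frame
TICK_MS = 125.0                        # 0.125 s combat tick
print("frames per combat tick:", TICK_MS / FRAME_MS)   # 3.75

for peak_ms in (11944, 11972, 12011):  # the three histogram peaks above
    frames = peak_ms / FRAME_MS
    print(peak_ms, "ms ~=", round(frames, 2), "frames ->", round(frames), "whole frames")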


 

Posted

Going by your stated data:

11944 ms = 358.16 frames. Since this is a discrete system, I'd expect that your data points don't really have Gaussian distribution, and that they more likely have several minimum values, each with an exponential, Poisson or some other decay function following it. So your minimum value is the "zero" and has the largest probability. As you go higher, that probability falls off until you hit the next discrete step, where it again goes to maximum.

11944 ms should snap to 358 frames
11972 ms should snap to 359 frames
12011 ms should snap to 360 frames

Going back to the original data posted for the power: 2.5 seconds for animation, 9.2 seconds for recharge.

2.5s is a nice value because the combat clock is exactly in synchronization with the frame clock in that case. 2.5s = 20 combat ticks, 75 frames. 9.2s = 73.6 combat ticks, 276 frames.

Of the expected 358-frame minimum total, 351 frames are accounted for. The remaining deltas of 7, 8 and 9 frames reek suspiciously of an attempt to resynchronize the combat tick with frame ticks at some point during the activate/finish/begin-recharge/end-recharge sequence. 7.5 frames correspond to exactly 2 combat ticks: one presumed to be at the end of activation based on the initial analysis, the second in some other, unknown position.

I can come up with a theory for 6 of the mystery frames like so (forgive formatting):

A + Animation + B + Recharge + C

Animation = 20 combat, 75 frame
Recharge = 73.6 combat, 276 frame

A is probably zero
B likely needs to resynchronize the clocks after 1 combat tick of wait time, to indicate that the next state is ready
C likely needs to resynchronize the clocks again, to let the combat engine know that the power is ready.

Frames: 0 + 75 + 3.75 + 276 + 2 = 356.75 (~357 -> 11.9s)
Combat: 0 + 20 + 1 + 73.6 + 0.4 = 95 (95 -> 11.875s)

Need to find that last frame. The 3.75 is assumed to round up to 4 in the resulting 11.9s. Likewise, the 0.4 at the end of the combat row is assumed to be the remainder that brings the total to an integer number of combat ticks.
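
Here's the same bookkeeping as a quick sketch, so the frame and combat tick totals are easy to recheck. The B and C resynchronization terms are this post's guesses, not confirmed mechanics; only the 2.5s animation and 9.2s recharge come from the power data.

Code:
FRAME = 1.0 / 30.0     # seconds per animation frame
TICK = 0.125           # seconds per combat tick
animation_s, recharge_s = 2.5, 9.2

#              A    Animation            B      Recharge            C
frame_terms = [0.0, animation_s / FRAME, 3.75,  recharge_s / FRAME, 2.0]
tick_terms  = [0.0, animation_s / TICK,  1.0,   recharge_s / TICK,  0.4]

print("frames:", sum(frame_terms), "->", sum(frame_terms) * FRAME, "s")
print("ticks: ", sum(tick_terms),  "->", sum(tick_terms)  * TICK,  "s")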


 

Posted

Quote:
Originally Posted by Savos View Post
Going by your stated data:

11944 ms = 358.16 frames. Since this is a discrete system, I'd expect that your data points don't really have Gaussian distribution, and that they more likely have several minimum values, each with an exponential, Poisson or some other decay function following it.
Code:
Bin    Frequency
11896.00    1
11913.00    1
11917.00    2
11919.00    2
11920.00    1
11921.00    5
11922.00    1
11923.00    2
11924.00    2
11925.00    1
11926.00    5
11927.00    9
11928.00    10
11929.00    4
11930.00    9
11931.00    9
11932.00    9
11933.00    7
11934.00    12
11935.00    20
11936.00    16
11937.00    28
11938.00    27
11939.00    29
11940.00    31
11941.00    36
11942.00    43
11943.00    48
11944.00    60
11945.00    52
11946.00    41
11947.00    41
11948.00    31
11949.00    31
11950.00    11
11951.00    17
11952.00    10
11953.00    9
11954.00    3
11955.00    6
11956.00    5
11957.00    4
11958.00    5
11959.00    13
11960.00    19
11961.00    14
11962.00    21
11963.00    29
11964.00    42
11965.00    37
11966.00    55
11967.00    79
11968.00    58
11969.00    57
11970.00    72
11971.00    90
11972.00    75
11973.00    72
11974.00    63
11975.00    72
11976.00    75
11977.00    44
11978.00    43
11979.00    35
11980.00    41
11981.00    36
11982.00    17
11983.00    14
11984.00    12
11985.00    10
11986.00    7
11987.00    6
11988.00    3
11989.00    3
11990.00    1
11991.00    7
11992.00    2
11993.00    5
11994.00    7
11995.00    6
11996.00    6
11997.00    6
11998.00    7
11999.00    10
12000.00    3
12001.00    11
12002.00    8
12003.00    15
12004.00    25
12005.00    29
12006.00    32
12007.00    33
12008.00    29
12009.00    44
12010.00    56
12011.00    50
12012.00    54
12013.00    49
12014.00    45
12015.00    38
12016.00    25
12017.00    17
12018.00    21
12019.00    6
12020.00    6
12021.00    7
12022.00    1
12052.00    1
It can sometimes be a little more complex than you're thinking, because what I've noticed is that if an event occurs during a combat clock (0.125s), those events are not aligned by the game engine to occur at the start of the clock. They occur when the game engine performs them.** So in fact if two attacks are meant to go off simultaneously at t=1.25s, it's entirely possible and in fact probable that one of them will go off at t=1253ms and the other at t=1255ms, in the order they are processed. And that's why you'll see clock jitter around fencepost clock pulses when you time things like critter AI shooting decisions. Those are aligned to a 0.5 second clock, which means they will always occur at t = N * 500ms, plus some offset. But the offset is in effect "anchored" to the clock pulses, so when you run a long, hours-long test of critter AI, you get this odd behavior where the clock pulses keep reverting to the mean rather than random walking. They *never* drift more than 499ms away from the 500ms "beacon" and always eventually return to it.

That "anchored drift" as opposed to random walk behavior is a strong indicator of (server) quantum alignment. But in other timing tests, you see random walk behavior: the timing drifts randomly higher and lower without limit. That's a case where the offsets have a non-aligned component that can cause the timing to accumulate and drift away from the fenceposts.

This is mostly academic, but it does determine if the drift error is independent or not, and that could mean a determinable difference in some kinds of attack chains (whether someone will take the time to determine if their specific attack chain is clock-aligned and can take advantage of that fact is a separate issue).


** It is this fact that causes server-side temporal lag during high-density events like zone events and old school (pre-I9) Hamidon raids. During those events, recharge itself can slow down because time is slowing down: the server has more work to do than it can complete in one server quantum, so fewer quanta happen per unit time, and time essentially slows down.


[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]

In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)

 

Posted

So, swipe->slash->swipe.

First, I tested with this chain: swipe->slash->swipe->strike. Why? Primarily for this reason: I can only tell when an attack starts in demorecords, not when one ends. So I like to bookend an attack chain with a dummy attack, because that is the attack that will tell me how long the final attack in the chain took, which can provide valuable information.

Here's the summary information. I ran this chain fifteen times. I timed the attacks as SwipeA, Slash, SwipeB, Strike. The average execution time of each attack was:

SwipeA: 1089ms
Slash: 1629ms
SwipeB: 1082.5ms

That's at +9.6% recharge (a +3 TO). Here are the same numbers at +19.2% recharge (2 +3 TOs):

SwipeA: 1087ms
Slash: 1489ms
SwipeB: 1087ms

It might be interesting to note that Slash is now actually timing out significantly *less* than its calculated ArcanaTime (1584ms). But that's actually something I observed when testing ArcanaTime in the first place. Due to how attacks "line up" with the animation and server clocks, it can happen that an attack will *look* a little longer because it "slops" over a clock, but when that happens the next attack in the chain usually "picks up" that time. In other words, it's an aliasing issue: attack A edges a little into the next clock, which makes it seem longer but also makes the following attack seem shorter. And that's what we're seeing here. Note the total duration of the three attacks in the second run: 3663ms. That's *very* close to the calculated ArcanaTime of the chain, 3696ms. This clock aliasing tends to average out. You're not really seeing Swipe run slower than ArcanaTime predicts; you're really seeing SwipeA stealing some time from Slash, making SwipeA look longer and Slash look shorter.
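
To put numbers on the aliasing claim, compare each measurement at +19.2% against its ArcanaTime prediction and then compare the chain totals. A sketch, using only the figures quoted in this thread:

Code:
FRAME_MS = 1000.0 / 30.0
predicted = {"SwipeA": 1056, "Slash": 1584, "SwipeB": 1056}      # ms, ArcanaTime
measured  = {"SwipeA": 1087, "Slash": 1489, "SwipeB": 1087}      # ms at +19.2% recharge

for name in predicted:
    delta = measured[name] - predicted[name]
    print(name, "delta", delta, "ms (", round(delta / FRAME_MS, 1), "frames )")

print("chain measured:", sum(measured.values()), "ms")    # 3663
print("chain predicted:", sum(predicted.values()), "ms")  # 3696

Each Swipe runs about a frame long and Slash runs about three frames short, but the chain as a whole lands within a frame of its ArcanaTime total.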

It also means my theory takes a bullet. 9.6% recharge is not the best you can do; at higher recharge the chain speeds up, which suggests something else is going on. But this also seems inconsistent with my cycle time measurements for single attacks.

Sleepy again. Will think about it tomorrow. There's probably a simple explanation for the discrepancy, but I'm not going to find it tonight. If someone else figures it out, by all means post. In fact, I already have a hunch what it might be, but I'd rather figure it out tomorrow.


[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]

In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)

 

Posted

I'm still not convinced the average is the most useful measure for these kinds of tests.

The discrete nature of the events implies some clustering around specific frame or combat tick counts, plus or minus a few ms in either direction depending on where in the two clocks you initiate the event.

Now, if the server sent 32 updates per second, this would be much easier...


 

Posted

Quote:
Originally Posted by Arcanaville View Post
Maybe the problem is the beep. The beep is a client side thing, and it happens if at the instant you push the button the game client doesn't think the power is ready to execute. But if you queue the power, the power could execute at a moment in time when the server knows it can, but the client doesn't yet know you could.
This is not the case. To test, I fired up CoH while torrenting, causing my in-game ping to be 1500 on my crummy DSL. This caused a noticeable delay in the time between me pressing a power and receiving the "recharging..." text/beep, sometimes by several seconds. Shut the torrents off, 200ms ping, and the beep is nicely responsive again.

I'm also pretty sure I've seen it happen in reverse on the ITF, when the server lags so much that my long-recharge powers show on the client to be recharged but still result in a beep when I try to activate them.


 

Posted

Quote:
Originally Posted by Spruce View Post
This is not the case. To test, I fired up CoH while torrenting, causing my in-game ping to be 1500 on my crummy DSL. This caused a noticeable delay in the time between me pressing a power and receiving the "recharging..." text/beep, sometimes by several seconds. Shut the torrents off, 200ms ping, and the beep is nicely responsive again.

I'm also pretty sure I've seen it happen in reverse on the ITF, when the server lags so much that my long-recharge powers show on the client to be recharged but still result in a beep when I try to activate them.
When I test I queue powers specifically to eliminate the round trip network lag. I still get beeps, but in a different way than you describe.

Edit: and the lag you see is the difference I'm describing between the client and the server. The client tries to predict when a power will be available from its recharge, as sent to it numerically by the server. However, the client always presumes 10 seconds will take 10 seconds to elapse. The server believes 10 seconds will take 80 combat ticks and 300 animation frames to elapse. When it takes two whole seconds to execute a single combat tick because the server is overloaded at a particular spot, it takes over two minutes for the server to believe 10 seconds have elapsed and that a 10 second recharge power is now recharged and ready. The client doesn't know this and shows the power recharged. But it beeps because, while the little button shows ready, it hasn't gotten the "you can use the power now" signal.
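
A minimal sketch of that arithmetic, using the 80-combat-tick figure above (the stretched tick lengths are hypothetical load scenarios):

Code:
recharge_s = 10.0
ticks_needed = recharge_s / 0.125            # 80 combat ticks for a 10s recharge

for actual_tick_s in (0.125, 0.5, 2.0):      # healthy server vs. increasingly overloaded
    wall_clock_s = ticks_needed * actual_tick_s
    print("tick =", actual_tick_s, "s ->", recharge_s, "s of recharge takes",
          wall_clock_s, "s of wall-clock time")

At a 2-second tick, the 10-second recharge takes 160 seconds of wall-clock time, which is the "over two minutes" above.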

However, my testing seems to confirm my suspicion that if you queue a power on the server, and the recharge is just fast enough for the power to become ready when it's needed, the server *executes* the power and sends the "the power is ready" message to the client, but in the time it takes for the message to actually travel to the client (network one-way lag) the client *also* believes the power is recharged, believes the power is queued, *hasn't* received the ready flag yet, and beeps. In other words, the client can beep even when the power has actually already executed on the server, and the client just doesn't know this yet.


[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]

In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)

 

Posted

Quote:
Originally Posted by Spruce View Post
I'm also pretty sure I've seen it happen in reverse on the ITF, when the server lags so much that my long-recharge powers show on the client to be recharged but still result in a beep when I try to activate them.
I see that all the time on zombie and Rikti invasions, drives me nuts.


 

Posted

Quote:
Originally Posted by Savos View Post
I'm still not convinced the average is the most useful measure for these kinds of tests.

The discrete nature of the events implies some clustering around specific frame or combat tick counts, plus or minus a few ms in either direction depending on where in the two clocks you initiate the event.

Now, if the server sent 32 updates per second, this would be much easier...
I'm not sure you understand what I'm saying, primarily because you're repeating part of what I'm saying. Jitter is specifically the term I use to refer to offset desynchronization as opposed to random independent error, which I think you're suggesting I'm not taking into account. This is the fundamental basis of the ArcanaTime theory, not averages. I'm only looking at averages to sift the data quicker and see alignment patterns. That's also why I do histograms such as the one I posted above. However, histograms only work for very large datasets, so I don't use them all the time. Once I know the basic character of the data, I do shorter data runs to get a better handle on what is going on in multiple situations, which would take forever if I had to do multiday runs per power per situation. However, the fundamental characterization analysis always eventually gets refined with an initial histogram pattern analysis and then a frequency and discontinuity analysis.

The basic theory hinges on this principle. If the server naturally aggregates event processing into server ticks, then events will preferentially align with clock triggers. But unless the server back-timestamps events (it doesn't) there will be two kinds of blurring of that. The first is internal anchored offset and the second is interclock jitter, both of which will make the data fuzzier than you would expect.

Anchored offset first. Suppose the server is supposed to do something once per second, every second. You'd expect those events to happen thusly:

1000ms 2000ms 3000ms 4000ms 5000ms...

If you measured the interval between events, you'd get a consistent 1000ms.

But because the server takes a finite amount of time to process all the stuff it has to do, it can't always do everything it needs to do in one single instant of time. So it takes a few milliseconds. This means that while the server tries to do the events on that schedule, events could happen a bit later than intended:

1012ms 2031ms 3005ms 4009ms 5061ms...

Notice that if you tried to measure the interval between events, you'd now get:

1019ms 974ms 1004ms 1052ms...

Over time this would average out to 1000ms. It would have to, because ultimately those little offsets themselves have to average out, because they are in effect anchored to the 1000ms interval.

Now, this is *different* from an unaligned event. Suppose instead that this event occurred every 1000ms plus or minus 50ms. In that case, at a glance you might expect to see similar data. But in this case, instead of jitter about a base frequency, we have random walk drift. Over time, the percentage variance from the base frequency would drop, but the absolute magnitude of the variance would actually increase - just like a random walk. Eventually, you'd see events occur two seconds sooner or three seconds later than predicted. Now, over eight hours that would still be almost dead on. But in the first case, that would be impossible: the absolute magnitude of the variance would *always* be no more than 999ms from predicted (short of server hiccoughs, which themselves can be detected in the data as abrupt single-point events).

That means the difference between random offset and synchronous offset is detectable, and also says something about predictions. In particular, it says that aligned events are far more stable and will conform to predictions much more tightly than statistically random ones.
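
Here's a toy simulation of the two cases, just to show the signature I'm describing: an "anchored" series where each event is its scheduled time plus a small processing delay, versus an "unanchored" one where each event is the previous event plus 1000ms plus independent error. Illustrative only; this is not a model of the actual server loop.

Code:
import random
random.seed(1)
N = 10_000

# Anchored: event i fires at i*1000ms plus a small, bounded processing delay.
anchored = [i * 1000 + random.uniform(0, 60) for i in range(1, N + 1)]

# Unanchored: each event fires 1000ms +/- 50ms after the previous one.
unanchored, t = [], 0.0
for _ in range(N):
    t += 1000 + random.uniform(-50, 50)
    unanchored.append(t)

def max_drift(times):
    # Largest deviation from the ideal schedule (i+1)*1000ms.
    return max(abs(t - (i + 1) * 1000) for i, t in enumerate(times))

print("anchored max drift:  ", round(max_drift(anchored)), "ms")    # stays under 60ms
print("unanchored max drift:", round(max_drift(unanchored)), "ms")  # random-walks into the thousands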


Now, that's intraclock offset. Interclock jitter is the related phenomenon where multiple events occur in a way that their initiation is forced to be aligned with an event clock but their measured execution is not. In this case, we have powers executing in sequence on the server locked to the combat clock (actually, it's a complex interplay between the combat clock and the animation clock, but not important here), but running animations that are themselves not bound by that clock and running just against the animation clock. Also, animations are typically shorter than the arcanatime of the powers.

This means that in terms of the actual powers themselves, they execute on a clock:

1000ms 2000ms 3000ms 4000ms 5000ms...

But the animations "wiggle" within those windows. Let's say the animation itself is only 900ms long. The first one can start anywhere from 1000ms to 1100ms and not intrude on the execution of the next power. And animations are only frame-aligned: if they don't execute "now" they have to start 33ms from now. So they can sometimes start a frame later than normal. If I measure with demorecords, I could see this:

1033ms 2000ms 3033ms...

From my point of view, the first power took only 967ms and the second one took 1033ms. But notice that just like with intraclock offset above, it *must* average out, because this is a bit of an illusion: the powers themselves are executing in perfect 1000ms pulses, but the animations themselves are "jittering" around those activation moments.


Now, combine the two sources of measurement offset: intraclock delay and interclock jitter. You'll get the kinds of measurements I mention above. It's analyzing the harmonics within the data that first provided the ArcanaTime beat frequency of 132ms. And by the way, you'd think it should be 133ms, because if it's a true multiple of a 1/30th second clock, 133 is closer to four such clock ticks than 132ms. Except 133 doesn't work. 132ms does, and both the harmonic analysis and explicit testing seem to show that 132 matches observations far closer than 133. In some cases 133 seems to predict entirely the wrong thing for certain powers that are right on the edge.
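
As an illustration of that "right on the edge" behavior, here's a sketch comparing a 132ms beat against a 133ms beat. It assumes the commonly quoted form of the ArcanaTime calculation (round the cast time up to the next whole beat, then add one more beat); the formula isn't restated in this thread, so treat the form itself as an assumption, and the cast times as arbitrary examples.

Code:
import math

def quantized_time(cast_s, beat_s):
    # Assumed form: round up to the next whole beat, then add one extra beat.
    beats = math.ceil(round(cast_s / beat_s, 6))   # guard against float noise
    return (beats + 1) * beat_s

for cast in (0.80, 1.056, 1.32, 1.33):             # arbitrary example cast times
    t132 = quantized_time(cast, 0.132)
    t133 = quantized_time(cast, 0.133)
    print("cast", cast, "s: 132ms beat ->", round(t132, 3), "s, 133ms beat ->", round(t133, 3), "s")

Most cast times come out within a few milliseconds either way, but a cast sitting right at a beat boundary (the 1.33s row, which lands on 1.584s with the 132ms beat and a full beat lower with 133ms) is exactly the kind of edge case where the two choices predict different things.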


So, yeah. Averages are currently being used as a data sifting tool; the analysis is not based on straight averages. At some point, I'll try to figure out how to post all the raw data, and people with different methodologies can take alternate swings at it. In fact, there are still minor unsolved glitches in ArcanaTime itself, but everyone who tried to resolve them with alternate theories back when I first published it ended up with something that did far worse at predicting optimal attack chain performance. At the moment, you could say ArcanaTime makes all the right mistakes to match reality - except for recharge, which we're looking at now.


[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]

In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)

 

Posted

Quote:
Originally Posted by Arcanaville View Post
When I test I queue powers specifically to eliminate the round trip network lag. I still get beeps, but in a different way than you describe.

Edit: and the lag you see is the difference I'm describing between the client and the server. The client tries to predict when a power will be available from its recharge, as sent to it numerically by the server. However, the client always presumes 10 seconds will take 10 seconds to elapse. The server believes 10 seconds will take 80 combat ticks and 300 animation frames to elapse. When it takes two whole seconds to execute a single combat tick because the server is overloaded at a particular spot, it takes over two minutes for the server to believe 10 seconds have elapsed and that a 10 second recharge power is now recharged and ready. The client doesn't know this and shows the power recharged. But it beeps because, while the little button shows ready, it hasn't gotten the "you can use the power now" signal.

However, my testing seems to confirm my suspicion that if you queue a power on the server, and the recharge is just fast enough for the power to become ready when it's needed, the server *executes* the power and sends the "the power is ready" message to the client, but in the time it takes for the message to actually travel to the client (network one-way lag) the client *also* believes the power is recharged, believes the power is queued, *hasn't* received the ready flag yet, and beeps. In other words, the client can beep even when the power has actually already executed on the server, and the client just doesn't know this yet.
But that's not consistent with what I was describing at all. In high-network-lag situations, there is a delay between pressing a power button and receiving a beep. What explanation is there for this other than that the client tells the server, "I want to activate power X", waits for a response, and then decides whether to play the beep? Meaning that beep vs. no beep is entirely decided by the server, and should be consistent with what the server believes. I don't think the client's timing plays into the issue at all, aside from the cosmetic decision of whether or not to dim the power button. Forgive me if I'm misunderstanding you.


 

Posted

Quote:
Originally Posted by Spruce View Post
But that's not consistent with what I was describing at all. In high-network-lag situations, there is a delay between pressing a power button and receiving a beep. What explanation is there for this other than that the client tells the server, "I want to activate power X", waits for a response, and then decides whether to play the beep? Meaning that beep vs. no beep is entirely decided by the server, and should be consistent with what the server believes. I don't think the client's timing plays into the issue at all, aside from the cosmetic decision of whether or not to dim the power button. Forgive me if I'm misunderstanding you.
It isn't consistent with what you're describing, which is sort of my point. I queue attacks when I test, so the beep I hear can't be a response to my act of selecting the attack. It has to be the result of the queued action failing and the client either being informed of that fact or failing to be informed of the power activation quickly enough. However, I'll add it to my list of things to test more carefully with a WAN emulator that can induce controlled latency into my link more precisely. Attempting to flood the link with traffic to create that latency is a little too difficult to control for reliable testing. I can't do that this weekend because I'm working on other things that require I not tamper with my connectivity too much, but I will try to test that later next week. Although this testing is now heading into parts unknown: testing the precise mechanics of the beep is a bit odd even for me.


[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]

In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)

 

Posted

I'm going to dumb down the conversation a bit with my more pragmatic input.

The way I see it, there are a lot of conceivable possibilities to investigate, but they all come up with one of 3 results: the activation of a recharged power can start at the first tick after the sum of the activation time plus the recharge time, or "that" + 1 tick, or "that" + 2 ticks. Whatever the reason, it appears to me from trying various recharge levels that the answer is +1 tick.


I gotta make pain. I gotta make things right. I gotta stop what's comin'. 'Least I gotta try.

 

Posted

Just a short update. I've started a new batch of tests, and while they aren't complete yet, one thing that seems to be suggested by the data is that, as you might expect but isn't necessarily obvious, recharge may be bound to the animation clock. Meaning: it's possible you may only be able to reduce the recharge of a power in 0.033s increments (approximately). Plus or minus about 5 milliseconds - and that plus or minus seems to be gaussian jitter, not random error - timing measurements cluster around those staircases. Also: the server seems to be capable of making one or two frame errors, and then "catching up" on the next attack. This ability to lag one power and then make up the time on the next power is part of why ArcanaTime seems to work: individual powers don't always consistently clock exactly according to AT predictions, but sequences of powers tend to follow the predictions more closely than any one power does, because offsets in one power are evened out in the very next power very consistently.

For the numerically inclined, this is what I mean:

Code:
1460    1117    -4    2    -2
1456    1085    -4    1    -3
1492    1118    -3    2    -1
1526    1087    -2    1    -1
1458    1120    -4    2    -2
1526    1048    -2    0    -2
1493    1084    -3    1    -2
1492    1084    -3    1    -2
1523    1048    -2    0    -2
1496    1119    -3    2    -1
1457    1082    -4    1    -3
1524    1086    -2    1    -1
1491    1087    -3    1    -2
1493    1089    -3    1    -2
1492    1089    -3    1    -2
1491    1086    -3    1    -2
1460    1121    -4    2    -2
1457    1120    -4    2    -2
1462    1255    -4    6    2
1460    1113    -4    2    -2
1455    1083    -4    1    -3
1491    1090    -3    1    -2
1494    1087    -3    1    -2
1459    1115    -4    2    -2
1524    1048    -2    0    -2
1524    1056    -2    0    -2
1460    1115    -4    2    -2
1457    1086    -4    1    -3
1492    1083    -3    1    -2
1526    1053    -2    0    -2
1492    1089    -3    1    -2
1457    1118    -4    2    -2
1523    1050    -2    0    -2
That's 33 measurements of the timing duration of Slash followed by Swipe, in milliseconds. Notice that when Swipe is low, Slash seems high, relative to their predicted times (1584ms and 1056ms respectively). In fact, the columns to the right are basically the approximate number of animation frames each measurement is above or below the predicted mark, and the sum of the two. Notice the preponderance of "-2"s in the far right column. If the measurement errors were random, those numbers would be random. Instead, the majority of the time they just happen to sum to -2 (24 out of 33 are -2, and the one way out at +2 is an obvious server glitch). Clearly, something is happening here where the server is, deliberately or coincidentally by design, trying to keep the attack chain agreeing with ArcanaTime predictions.
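
For anyone who wants to rebuild those right-hand columns, this is roughly how they're derived: each measurement minus its prediction (1584ms for Slash, 1056ms for Swipe), converted to frames at ~33ms each, then summed per pair. A sketch with the first few rows; the rest of the table follows the same pattern.

Code:
from collections import Counter

FRAME_MS = 1000.0 / 30.0
PRED_SLASH, PRED_SWIPE = 1584, 1056

pairs = [(1460, 1117), (1456, 1085), (1492, 1118), (1526, 1087), (1458, 1120)]
# ...remaining rows from the table above omitted for brevity

sums = []
for slash_ms, swipe_ms in pairs:
    d_slash = round((slash_ms - PRED_SLASH) / FRAME_MS)
    d_swipe = round((swipe_ms - PRED_SWIPE) / FRAME_MS)
    sums.append(d_slash + d_swipe)
    print(slash_ms, swipe_ms, d_slash, d_swipe, d_slash + d_swipe)

print("distribution of sums:", Counter(sums))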

That's important to note for recharge testing, because it suggests that it's *possible* that the amount of recharge you need may be dependent on *where* in the attack chain the power is, and that might force us to come up with safety estimates that will always work, rather than being able to be 100% certain what the absolute minimum will be situationally (at least, without doing more work than most people will want to do).


Ok, so that wasn't all that short. For one of my testing posts, its relatively short.


PS: Now you know why I generally tend to go off for a couple months to test something and come back with just one article that explains it all when I'm done. Posting as I go along and describing everything I see as I investigate is extremely longish, and probably extremely boring.


[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]

In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)

 

Posted

I know that there are a lot of things to test before anyone pegs it as the cause of the short Slashes in that test, but if it does turn out that it's dependent on position in the chain, it seems that the powers are being considered in pairs, which would suggest that a three-attack chain would see a lot more variation - about 50% of the time, it should be one of the longer, second attacks followed by a pair, and the other 50% a pair followed by a short attack. That should produce a rather characteristic beat in, say, a Slash/Swipe/Strike attack chain.

It'd also produce a rather odd result for attack chains: Four- and six-attack chains can finesse the recharge on their attacks, but three- and five-attack chains have to slot them all for worst-case scenarios in all powers, as any gap would show up on alternate chains.


 

Posted

Quote:
Originally Posted by Arcanaville View Post
PS: Now you know why I generally tend to go off for a couple months to test something and come back with just one article that explains it all when I'm done. Posting as I go along and describing everything I see as I investigate is extremely longish, and probably extremely boring.
No it isn't, but do w/e is most convenient and least likely to interfere with your work. Thanks.


Miss Arc #147491: Rise of Bedlam
AKA Iron Smoke @Champion Server

 

Posted

Arcanaville, where did you determine the 30 frames per second value?

It certainly looks like it works in the data sets you provide, but an initial look at the game protocol indicates a 10-per-second update rate from the client to the server for power activations, position updates and so on.

I don't have any reason to doubt 30/sec, just that the numbers from client to server don't match up nicely to 30. The rates are compatible (30 is a multiple of 10), so it would work out nicely as the server collates data to generate the next frame or combat tick.

I did not see an obvious pattern for server-to-client updates; they looked to be sent on an as-needed, delta basis.


 

Posted

Quote:
Originally Posted by Savos View Post
Arcanaville, where did you determine the 30 frames per second value?
I know for a fact that animations run on a 30 frame per second clock, and I've been explicitly told by the devs that for at least some things a 30/sec clock is the clock quantum. I believe the game servers support no clock faster than that (which doesn't mean they don't support events that take less time, just that the server only processes such events at 30 per second maximum).

Other measurements, including the ArcanaTime ones, strongly suggested a 30/sec clock and an 8/sec clock, and pohsyb confirmed that both kinds of clocks are used in event processing, although he didn't give me the full details of the game processing loop.

How fast the servers update the clients with network packets is probably less relevant to when things can happen, because within those updates appear to be timing information that tells the client when things *should* happen, which might be some time after the client receives the packet.


[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]

In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)

 

Posted

this thread is relevant to my interests. any news?

does this jitter affect the dps of a powerset like claws due to the larger number of interactions with the server in a situation such as a pylon test (over a broadsword player clicking less for instance)?

a recent excursion into recharge times, arcanatimes, and dps has me quite confused as to why the paper and pixels exhibit a significant difference in a recent claw/sr build and i'd like to learn why.

/thanks in advance.


Kittens give Morbo gas.

 

Posted

Quote:
Originally Posted by spice_weasel View Post
this thread is relevant to my interests. any news?

does this jitter affect the dps of a powerset like claws due to the larger number of interactions with the server in a situation such as a pylon test (over a broadsword player clicking less for instance)?

a recent excursion into recharge times, arcanatimes, and dps has me quite confused as to why the paper and pixels exhibit a significant difference in a recent claw/sr build and i'd like to learn why.

/thanks in advance.
A combination of work and I19 diverted my attention from this testing, but I haven't forgotten about it. However, it's likely that I won't get back to completing this until after the holidays.

For now, I would say the cautious thing to do is operate on the assumption that network lag does have an effect, in the sense of delaying when the server is told to execute the next power, and thus the power must be recharged earlier than needed - by about a quarter of a second, although this is possibly lag dependent - which necessitates higher than computed recharge. I still see discrepancies based on that theory, but the theory is more conservative than necessary, not less, and will likely over-predict rather than under-predict how much recharge you need.
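
If you want to apply that rule of thumb to a chain, the arithmetic looks roughly like this. The base/(1 + recharge) relation is the usual recharge formula; the quarter-second margin is the conservative allowance described above, and the power and gap numbers are made-up examples:

Code:
def required_recharge(base_recharge_s, available_gap_s, margin_s=0.25):
    # Return the +recharge fraction needed so the power is back up margin_s early.
    target_s = available_gap_s - margin_s
    if target_s <= 0:
        raise ValueError("gap is too small even before recharge is considered")
    return max(0.0, base_recharge_s / target_s - 1.0)

# Hypothetical example: a 9.2s base recharge power that has to be ready again
# 7.0s after it fires (the rest of the chain's ArcanaTime).
with_margin = required_recharge(9.2, 7.0)
no_margin = required_recharge(9.2, 7.0, margin_s=0.0)
print("with 0.25s margin: +{:.0%} recharge".format(with_margin))
print("with no margin:    +{:.0%} recharge".format(no_margin))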

If you have a specific question, though, you can post your observations and I or someone else on the forum will respond if we can.


[Guide to Defense] [Scrapper Secondaries Comparison] [Archetype Popularity Analysis]

In one little corner of the universe, there's nothing more irritating than a misfile...
(Please support the best webcomic about a cosmic universal realignment by impaired angelic interference resulting in identity crisis angst. Or I release the pigmy water thieves.)