```
def coverPoints(self, A, B):
    n = len(A)
    res = 0
    for i in range(1, n):
        # Distance moved along each axis between consecutive points.
        a = abs(A[i] - A[i-1])
        b = abs(B[i] - B[i-1])
        # A diagonal step covers one unit in x and y at once, so the
        # minimum number of steps for this leg is the larger distance.
        m = max(a, b)
        res += m
    return res
```
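To see the idea in isolation, here is a standalone sketch of the same Chebyshev-distance logic (the function name `min_steps` is my own, not from the original problem):

```python
def min_steps(xs, ys):
    # Minimum moves to visit points (xs[i], ys[i]) in order, when a
    # single step can change x, y, or both by 1 (like a chess king).
    # Each leg costs max(|dx|, |dy|): diagonal steps knock one unit off
    # both coordinates, and the leftover of the longer axis is walked
    # straight.
    return sum(max(abs(xs[i] - xs[i-1]), abs(ys[i] - ys[i-1]))
               for i in range(1, len(xs)))

# (0,0) -> (1,1) is one diagonal step; (1,1) -> (1,2) is one straight step.
print(min_steps([0, 1, 1], [0, 1, 2]))  # -> 2
```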


Why is it A[i] - A[i-1] and B[i] - B[i-1]?

This comes from https://www.geeksforgeeks.org/minimum-steps-needed-to-cover-a-sequence-of-points-on-an-infinite-grid/

Please help.