# Why is the completeness of UCS guaranteed only if the cost of every step exceeds some small positive constant?

Artificial Intelligence Asked by Kais Hasan on December 9, 2020

I was reading Artificial Intelligence: A Modern Approach (3rd edition), and I have reached the UCS (uniform-cost search) algorithm.

I was reading the proof that UCS is complete.

The book states:

Completeness is guaranteed provided the cost of every step exceeds some small positive constant $\epsilon$.

That's because UCS can get stuck if there is a path with an infinite sequence of zero-cost actions.

Why must the step cost exceed $\epsilon$? Isn't it enough for it to be greater than zero?

Let's consider a problem where every edge cost is greater than zero, but the costs are not bounded below by any positive $\epsilon$:

Imagine a problem with an infinite path where the first edge costs $\frac{1}{2}$, the next $\frac{1}{4}$, the one after that $\frac{1}{8}$, and so on forever. Every edge cost is greater than zero, meeting the condition proposed in the question. However, the path as a whole has finite cost (it converges to 1) even though it contains an infinite number of states. So, on this problem, UCS will never reach paths with cost greater than 1. If the solution costs 2, UCS will never find it, and thus it is not a complete algorithm. So all edge costs being greater than zero is not sufficient.
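This failure mode is easy to reproduce. Below is a small sketch (the graph, node names, and cutoff are made up for illustration): from state 0 there is a direct cost-2 edge to the goal, plus an infinite chain whose $n$-th edge costs $1/2^{n+1}$. Since the chain's total cost converges to 1, UCS keeps expanding chain nodes and never pops the goal; raising every step cost to a floor of $\epsilon$ fixes it.

```python
import heapq

GOAL = -1  # arbitrary sentinel for the goal state

def successors(state):
    """State 0 has a cost-2 edge to the goal and starts an infinite chain
    0 -> 1 -> 2 -> ... whose n-th edge costs 1/2**(n+1)."""
    if state == 0:
        yield (GOAL, 2.0)                       # the only path to the goal
    if state >= 0:
        yield (state + 1, 0.5 ** (state + 1))   # geometric step costs

def successors_eps(state, eps=0.05):
    """Same graph, but every step cost is raised to at least eps."""
    for nxt, cost in successors(state):
        yield (nxt, max(cost, eps))

def ucs(start, succ, max_expansions=10_000):
    """Uniform-cost search with a cutoff on the number of expansions."""
    frontier = [(0.0, start)]
    best = {start: 0.0}
    for _ in range(max_expansions):
        cost, state = heapq.heappop(frontier)
        if state == GOAL:
            return cost
        for nxt, step in succ(state):
            new_cost = cost + step
            if nxt not in best or new_cost < best[nxt]:
                best[nxt] = new_cost
                heapq.heappush(frontier, (new_cost, nxt))
    return None  # cutoff hit: the goal was never expanded

print(ucs(0, successors))      # None: stuck below cost 1 forever
print(ucs(0, successors_eps))  # 2.0: with a cost floor, the goal is found
```

With the raw geometric costs the frontier minimum never reaches 2, so the search exhausts the expansion budget; with the $\epsilon$ floor the chain's path cost grows without bound, so after finitely many expansions the cost-2 goal becomes the cheapest frontier entry and is returned.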

For most search algorithms to be complete, there must be a finite number of states with any given cost. (To be slightly more precise, there must exist some fixed $\epsilon$ such that every cost interval of width $\epsilon$ contains only a finite number of states.)
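One way to make this precise is the standard counting argument (the same one behind AIMA's $O(b^{1+\lfloor C^*/\epsilon \rfloor})$ complexity bound for UCS), assuming a finite branching factor $b$ and an optimal solution cost $C^*$:

```latex
% Why a positive lower bound on step costs gives completeness:
% every step costs at least \epsilon > 0, so
\text{cost of a } d\text{-step path} \ge d\,\epsilon
  \implies \text{any path of cost} \le C^* \text{ has depth} \le \lfloor C^*/\epsilon \rfloor,
% and with branching factor b there are at most
b^{1 + \lfloor C^*/\epsilon \rfloor} \text{ nodes at those depths.}
```

So UCS can expand only finitely many nodes before it must pop a goal node of cost $C^*$. When step costs can shrink toward zero, no such depth bound exists, and the argument collapses, as the geometric-cost path above shows.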

Correct answer by Nathan S. on December 9, 2020
