Performance is a measure of the quality of the actions the agent has taken.
The environment is the world in which the agent operates. Environments are described by the following main properties:
If an agent's sensory apparatus gives it access to the complete state of the environment, then we say that the environment is accessible to that agent. An environment is effectively accessible if the sensors detect all aspects that are relevant to the choice of action. An accessible environment is convenient because the agent need not maintain any internal state to keep track of the world.
If the next state of the environment is completely determined by the current state and the actions selected by the agent, then we say the environment is deterministic. In principle, an agent need not worry about uncertainty in an accessible, deterministic environment.
In an episodic environment, the agent's experience is divided into "episodes". Each episode consists of the agent perceiving and then acting. The quality of its action depends just on the episode itself, because subsequent episodes do not depend on what actions occur in previous episodes. Episodic environments are much simpler because the agent does not need to think ahead.
If the environment can change while an agent is deliberating, then we say the environment is dynamic for that agent; otherwise it is static. Static environments are easy to deal with because the agent need not keep looking at the world while it is deciding on an action, nor need it worry about the passage of time. If the environment does not change with the passage of time but the agent's performance score does, then we say the environment is semidynamic.
If there are a limited number of distinct, clearly defined percepts and actions, we say that the environment is discrete. Chess is discrete - there are a fixed number of possible moves on each turn. Taxi driving is continuous - the speed and location of the taxi and the other vehicles sweep through a range of continuous values.
In a multi-agent environment, on the other hand, agents need to account for the actions of other agents. In particular, if the other agents compete directly with one another, the environment is said to be competitive, whereas if the agents work toward a shared goal, it is said to be cooperative. Note that the two qualities are not mutually exclusive: an environment can be both competitive and cooperative to different degrees.
An example of a cooperative environment is a G-rated driving game - none of the agents in the world (usually) wants to crash into another. In a game such as destruction derby, by contrast, agents are actively trying to crash into one another, and the environment leans heavily toward the competitive side.
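The dimensions above can be collected into a simple profile for comparing task environments. The sketch below is illustrative only - the class and field names are ours, not a standard API - and it encodes the chess and taxi-driving examples discussed above.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class EnvProfile:
    """Illustrative profile of a task environment along the
    dimensions described above (names are ours, not standard)."""
    accessible: bool      # sensors see the complete, relevant state
    deterministic: bool   # next state fixed by current state + action
    episodic: bool        # episodes are independent of one another
    static: bool          # world does not change while deliberating
    discrete: bool        # finite, clearly defined percepts/actions


# Chess (without a clock): the full board is observable, the legal
# moves are finite, and the world only changes when a player moves.
chess = EnvProfile(accessible=True, deterministic=True,
                   episodic=False, static=True, discrete=True)

# Taxi driving: traffic is only partially observable and not fully
# predictable, the world changes while the driver deliberates, and
# speed and position vary continuously.
taxi = EnvProfile(accessible=False, deterministic=False,
                  episodic=False, static=False, discrete=False)
```

A profile like this makes the contrast explicit: chess sits at the "easy" end of every dimension except episodicity, while taxi driving sits at the hard end of all of them.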
Sensors allow the agent to collect the percept sequence that will be used for deliberating on the next action.
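The sense-deliberate-act cycle this describes can be sketched as a small driver loop. This is a minimal sketch under our own assumptions - `run_agent`, `get_percept`, and `do_action` are hypothetical names, not part of any particular framework - showing how the growing percept sequence feeds the agent program at each step.

```python
from typing import Any, Callable, List


def run_agent(agent_program: Callable[[List[Any]], Any],
              get_percept: Callable[[], Any],
              do_action: Callable[[Any], None],
              steps: int) -> List[Any]:
    """Drive an agent for a fixed number of steps: sensors produce
    percepts, the agent program maps the percept sequence so far to
    an action, and actuators carry the action out."""
    percepts: List[Any] = []
    for _ in range(steps):
        percepts.append(get_percept())      # sense
        action = agent_program(percepts)    # deliberate on the sequence
        do_action(action)                   # act
    return percepts


# Usage: a trivial reflex-style program that reacts only to the
# latest percept (here, doubling a numeric reading).
readings = iter([1, 2, 3])
actions_taken: List[int] = []
run_agent(lambda seq: seq[-1] * 2,
          lambda: next(readings),
          actions_taken.append,
          steps=3)
```

Note that the program receives the whole percept sequence; a reflex agent ignores all but the last element, while agents in inaccessible environments may use the history to maintain internal state.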