George Bernard Shaw (1856–1950) was born in Dublin, Ireland. At the age of 14, after graduating from middle school, Shaw was put into a job as a clerk in a land agent’s office. At 20 he went to London, where he remained jobless for 9 years, devoting much time to self-education. Meantime, Shaw took an active part in the socialist movement. A contemporary of Shaw’s thus wrote of him: "I used to be a daily frequenter of the British Museum Reading Room. Even more assiduous in his attendance was a young man. ...My curiosity was piqued by the odd conjunction of his subjects of research. Day after day for weeks he had before him two books—Karl Marx’s ’Das Kapital’ (in French), and an orchestral score of ’Tristan and Isolde’."
Though Shaw admitted Marx’s great influence on him, he failed to grasp the necessity of a revolutionary reconstruction of the world. A strong influence was exercised on Shaw by the Fabian Society, the English reformist organization.
In the early period of his literary career, Shaw wrote some novels, "An Unsocial Socialist" and others, in which he developed the traditions of critical realism, bitterly criticizing the stupidity, snobbishness and petty tyranny of the middle class. In the nineties Shaw turned to the theatre, first working as a dramatic critic, then writing plays for the stage. His role in the development of dramaturgy is very great. Shaw was an enemy of "art for art’s sake". He wrote, "for art’s sake I will not face the toil of writing a sentence." He used the stage to criticize the evils of capitalism. He wrote 51 plays in total, the important ones including "Widowers’ Houses", "Saint Joan" and "The Apple Cart". In his plays Shaw laid bare the gross injustice and utter inhumanity of the bourgeois society. This he achieved not so much by the structures of plots in his plays as by the brilliant dialogues between the characters. His exposure of capitalist society is very significant and it places Shaw among the most important representatives of critical realism in modern English literature.
In his plays, Shaw achieved the exposure of capitalist society ______.
A:by either the structures of plots or the brilliant dialogues between the characters B:by the brilliant dialogues between the characters better than by the structures of plots C:by both the brilliant plots and dialogues equally D:by the brilliant dialogues between the characters instead of by the structures of plots
Good looks, the video-games industry is discovering, will get you only so far. The graphics on a modern game may far outstrip the pixellated blobs of the 1980s, but there is more to a good game than eye candy. Photo-realistic graphics make the lack of authenticity of other aspects of gameplay more apparent. It is not enough for game characters to look better—their behaviour must also be more sophisticated, say researchers working at the interface between gaming and artificial intelligence (AI).
Today’s games may look better, but the gameplay is "basically the same" as it was a few years ago, says Michael Mateas, the founder of the Experimental Game Lab at the Georgia Institute of Technology. AI, he suggests, offers an "untapped frontier" of new possibilities. "We are topping out on the graphics, so what’s going to be the next thing that improves gameplay?" asks John Laird, director of the AI lab at the University of Michigan. Improved AI is a big part of the answer, he says. Those in the industry agree. The high-definition graphics possible on next-generation games consoles, such as Microsoft’s Xbox 360, are raising expectations across the board, says Neil Young of Electronic Arts, the world’s biggest games publisher. "You have to have high-resolution models, which requires high-resolution animation," he says, "so now I expect high-resolution behaviour."
Representatives from industry and academia will converge in Marina del Rey, California, later this month for the second annual Artificial Intelligence and Interactive Digital Entertainment (AIIDE) conference. The aim, says Dr Laird, who will chair the event, is to increase the traffic of people and ideas between the two spheres. "Games have been very important to AI through the years," he notes. Alan Turing, one of the pioneers of computing in the 1940s, wrote a simple chess-playing program before there were any computers to run it on; he also proposed the Turing test, a question-and-answer game that is a yardstick for machine intelligence. Even so, AI research and video games existed in separate worlds until recently. The AI techniques used in games were very simplistic from an academic perspective, says Dr. Mateas, while AI researchers were, in turn, clueless about modern games. But, he says, "both sides are learning, and are now much closer."
Consider, for example, the software that controls an enemy in a first-person shooter (FPS)—a game in which the player views the world along the barrel of a gun. The behaviour of enemies used to be pre-scripted: wait until the player is nearby, pop up from behind a box, fire weapon, and then roll and hide behind another box, for example. But some games now use far more advanced "planning systems" imported from academia. "Instead of scripts and hand-coded behaviour, the AI monsters in an FPS can reason from first principles," says Dr. Mateas. They can, for example, work out whether the player can see them or not, seek out cover when injured, and so on. "Rather than just moving between predefined spots, the characters in a war game can dynamically shift, depending on what’s happening," says Fiona Sperry of Electronic Arts.
If the industry is borrowing ideas from academia, the opposite is also true. Commercial games such as "Unreal Tournament", which can be easily modified or scripted, are being adopted as research tools in universities, says Dr. Laird. Such tools provide flexible environments for experiments, and also mean that students end up with transferable skills.
But the greatest potential lies in combining research with game development, argues Dr. Mateas. "Only by wrestling with real content are the technical problems revealed, and only by wrestling with technology does it give you insight into what new kinds of content are possible," he says.
According to the passage, good video-games used to be judged in terms of ______.
A:how sophisticated the behaviors of the characters are. B:how good-looking the characters seem to be. C:how sophisticated the artificial intelligence is. D:how much authenticity is displayed in the characters.
Passage One
Stories don’t just happen; they are created. There are no stories in the everyday course of events; there are only the ingredients for stories. A dozen people may watch a man standing on the fifth-floor ledge or a small child crying. There is no story involved in either case unless one of the dozen chooses to make one up—to surround the isolated event with a beginning and an end, thereby giving what we call a meaning to human action. In other words, there has to be a story-maker—a story-teller—if there is to be a story.
You as the story-maker or writer are in complete control of all of the details of your story. You have control over who the characters are, what they do, and why they do it. You also have control over how the story is to be told and who is going to tell it. You can adopt one of a number of points of view, each of which will give a quite different total story.
Broadly speaking, there are two major approaches a writer can take: (1) you can present the story as if told by someone who is completely outside it, or (2) you can present the story as if told by one of its characters. In either case, the teller’s role is an assumed role.
Who controls all the details of the story?
A:The story-maker. B:The characters. C:The publisher. D:The proofreader.