Learn English effectively and quickly: http://www.facebook.com/HocTiengAnhVOA, http://www.voatiengviet.com/section/hoc-tieng-anh/2693.html. If you cannot access VOA, please visit http://vn3000.com to get around the firewall. VOA's free English programs (VOA Learning English for Vietnamese) can help you improve your listening and pronunciation skills, understand grammar structures, and use English accurately. See also: http://www.facebook.com/VOATiengViet
Practice listening, speaking, and English vocabulary through video. Watch the next lessons: https://www.youtube.com/playlist?list=PLD7C5CB40C5FF0531
Tay is an artificial intelligence, or AI, developed by Microsoft and Bing researchers. The robot, chatbot or bot is able to copy human behavior as it talks and writes. Microsoft launched Tay online in March. But 16 hours later, the company had to remove the bot from the web. Tay had very quickly learned some extremely insulting language.

Microsoft said Tay was developed for use by 18- to 24-year-olds in the U.S. The company posted information about Tay before it went online. The post said that the more people talked to Tay, the smarter the bot would become. Developers hoped Tay would provide funny answers to tweets and other messages from the apps Kik, GroupMe and Twitter.

But many of the messages Tay received used offensive language. Soon, the bot was using that language in its own posts. Tay's posts included sexist and racist messages. Some of the bot's tweets got so bad that one newspaper called Tay the "Hitler-loving sex robot."

Kris Hammond is a computer scientist and professor. He told the Associated Press that he could not believe Tay's developers did not expect such problems to occur. Tay tweeted roughly 96,000 times before Microsoft took it down.

In a blog post, Microsoft took full responsibility for not predicting the problems. The company also said it is working on Tay again to try to fix it. It is hopeful the bot will return to the online world.
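The story above turns on one idea: a bot that learns from whatever users send it, with no filtering, will eventually repeat that material back. The short Python sketch below is a deliberately simplified illustration of that failure mode. It is not Microsoft's actual Tay design; the class name and methods are invented for this example.

```python
# Simplified illustration: a chatbot that "learns" by storing user
# messages and reuses them in replies, with no moderation step.
# This is a hypothetical sketch, not Microsoft's Tay implementation.

class EchoLearningBot:
    def __init__(self):
        # Everything users say is kept and may be reused later.
        self.learned_phrases = []

    def receive(self, message):
        # Learning step: store input unfiltered.
        self.learned_phrases.append(message)

    def reply(self):
        # Reply step: reuse learned material, so offensive input
        # becomes offensive output.
        if not self.learned_phrases:
            return "Hello!"
        return self.learned_phrases[-1]


bot = EchoLearningBot()
bot.receive("You are great!")
print(bot.reply())  # the bot echoes what it was taught
```

A real system would add a moderation filter between the learning step and the reply step; the blog post suggests predicting and blocking abusive input was exactly the step Tay's developers missed.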