Abstract

In this study, we investigate the use of machine learning and text mining techniques to classify and analyze TED talk videos based on their transcript data. We perform sentiment analysis to identify the opinions and feelings expressed by TED speakers about each talk topic, and use topic analysis to cluster the videos and compare them to the categories labeled by the TED website. We also apply text classification techniques to predict the topics of new videos using a random forest model. Our results show that TED talks tend to present a positive sentiment and that the clusters generated by Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) align closely with the known categories. The supervised model combining LSA on Term Frequency-Inverse Document Frequency (TF-IDF) features with additional metadata performed best, with an overall accuracy of 0.90. Limitations and potential directions for future research are also discussed.

1. Introduction

There are several ways to learn and share knowledge today, one of the most popular being short video clips. We are interested in using both machine learning and text mining techniques to analyze the classification of videos from their transcript or subtitle data. Initially, we considered several streaming websites and video podcasts, such as YouTube, BBC Learning English, Apple Podcasts, and TED Talks. TED was ultimately chosen for this project because of the availability of data for our study and its wide range of videos in terms of topics, languages, and lengths. Furthermore, each TED video is labeled with a relevant category and includes a transcript.

The goal of this project is to first use sentiment analysis to identify opinions, judgments, or feelings expressed by TED speakers about each TED talk topic. Second, we will use topic analysis to cluster the videos and compare them to the categories labeled by the TED website. Finally, we will apply text classification techniques to predict the topics of new videos.

The rest of the report is organized as follows: Section 2 describes the data and web scraping, Section 3 presents tokenization, Section 4 presents exploratory data analysis, Section 5 presents sentiment analysis, Section 6 performs topic modeling analysis, Section 7 performs embedding analysis, and Section 8 performs supervised analysis. The main results of the study, as well as limitations and potential future research, are presented in Section 9.

2. Data Preparation

To acquire the transcript text from TED talk videos, we implemented a web scraping procedure on the TED website, with the following steps:

  • Use the RSelenium package to open the TED website (a minimal setup sketch follows this list).
  • Go to the TED Talks section by clicking the navigation bar button.
  • Select the language, topic, and sort order to specify the range of videos.
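The snippets below assume a running Selenium session stored in remDr; the launch itself is not shown in our scraping code. Here is a minimal setup sketch for the first step, assuming Firefox and an arbitrary free port:

# launch a Selenium server and browser with RSelenium
library(RSelenium)

rD <- rsDriver(browser = "firefox", port = 4545L)  # the port number is arbitrary
remDr <- rD$client

# open the TED website and give the page time to load
remDr$navigate("https://www.ted.com")
Sys.sleep(2)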
# go to TED talk
drop_down <- remDr$findElement(using = 'xpath', '//*[@id="menu-button--0"]/div')
remDr$mouseMoveToLocation(webElement=drop_down)
remDr$findElement(using = 'xpath', '//*[@id="option-0--0"]/div[1]')$clickElement()
Sys.sleep(2)

# select language to English
drop_down <- remDr$findElement(using = 'xpath', '//*[@id="languages"]')
remDr$mouseMoveToLocation(webElement=drop_down)
remDr$click(2)
remDr$findElement(using = "xpath", '//*[@id="languages"]/optgroup/option[1]')$clickElement()
Sys.sleep(1)

# select topic (take climate change as an example)
drop_down <- remDr$findElement(using = 'xpath', '//*[@id="topics"]')
remDr$mouseMoveToLocation(webElement=drop_down)
remDr$click(2)
remDr$findElement(using = "xpath", '//*[@id="topics"]/option[3]')$clickElement()
remDr$findElement(using = "xpath", '/html/body/div[4]/div[2]/div/div/div/div[3]/ul[1]/li[3]/a')$clickElement()
# to choose a topic listed under a different first letter, change the number in [?]/a of the xpath
topic <- remDr$findElement(using = "partial link text", 'Climate change')  ## put topic name here
remDr$mouseMoveToLocation(webElement=topic)
remDr$click(2)
Sys.sleep(1)

# select sort by the most relevant
drop_down <- remDr$findElement(using = 'xpath', '//*[@id="filters-sort"]')
remDr$mouseMoveToLocation(webElement=drop_down)
remDr$click(2)
remDr$findElement(using = "xpath", '//*[@id="filters-sort"]/optgroup/option[2]')$clickElement()
  • Due to the constantly changing structure of the TED website, we encountered difficulties when attempting to scrape data directly from the individual video pages. As a workaround, we first scraped the video titles from the browse results page after completing the third step of our procedure, producing a data frame containing the names of all the videos we wanted to scrape further.
# first crawl all video titles from the browse results pages
html_page <- remDr$getPageSource()[[1]]
page <- 3
title <- as.character()
speaker <- as.character()
views_times <- as.character()
page_num <- as.character()

for (i in 1:page) {
  
  page_title <- read_html(html_page) %>% 
    html_nodes(xpath = "//*[@id='browse-results']/div[1]/div/div/div/div/div[2]/h4[2]/a") %>% 
    html_text() 
  page_title <- gsub("\n","", page_title)
  # each results page lists 36 videos; we want about 100 videos per topic, so we scrape 3 pages
  
  page_speaker <- read_html(html_page) %>% 
    html_nodes(xpath = "//*[@id='browse-results']/div[1]/div/div/div/div/div[2]/h4[1]") %>% 
    html_text() 
  
  page_views_times <- read_html(html_page) %>% 
    html_nodes(xpath = "//*[@id='browse-results']/div[1]/div/div/div/div/div[2]/div/span/span") %>% 
    html_text() 
  page_views_times <- gsub("\n","", page_views_times)
  
  page_page <- rep(i, times=length(page_title))
  
  next_page <- remDr$findElement(using = 'link text', 'Next')
  remDr$mouseMoveToLocation(webElement=next_page)
  remDr$click(2)
  Sys.sleep(5)
  
  html_page <- remDr$getPageSource()[[1]]
  
  title <- append(title, page_title)
  speaker <- append(speaker, page_speaker)
  views_times <- append(views_times, page_views_times)
  page_num <- append(page_num, page_page)

}

browse_result <- data.frame(
  "page" = page_num,
  "title" = title,
  "speaker" = speaker,
  "views_times" = views_times,
  "cate" = "Climate Change")
  • Click in the search box, search for each video by its title, and then always click the first search result using its xpath.
# click into each video page to capture its information
introduction <- as.character()
likes <- as.character()
tanscript <- as.character()
title_re <- as.character()
n <- length(waitforscrape$title)

  for (i in 1:n) {
    
    Sys.sleep(3)
    
    search <- remDr$findElement(using = 'xpath', '//*[@id="filters"]/div[1]/div/div[2]/div[1]/div[1]/div/div[1]/div/input')
    search$clickElement()
    Sys.sleep(5)
     
    search$clearElement()
    search$sendKeysToElement(list(waitforscrape$title[i], key = "enter"))
    Sys.sleep(8)
    
    # click in the video
    video_page <- remDr$findElement(using = 'xpath', "//*[@id='browse-results']/div[1]/div[1]/div/div/div/div[2]/h4[2]/a")
    remDr$mouseMoveToLocation(webElement=video_page)
    remDr$click(2)
    Sys.sleep(15)
    
    video_title <- waitforscrape$title[i]
    
    # open transcript
    drop_down <- remDr$findElement(using = 'xpath', "//*[@id='maincontent']/div/div/div/div/div[2]/div[3]/div[2]/button")
    remDr$mouseMoveToLocation(webElement=drop_down)
    remDr$click(2)
    Sys.sleep(5)
    
    # begin to crawl the information
    html_page <- remDr$getPageSource()[[1]]
    
    video_sum <- read_html(html_page) %>% 
      html_nodes(xpath = "//*[@id='maincontent']/div/div/div/div/div[2]/div[3]/div[1]/div[2]/div/div") %>% 
      html_text() 
    video_sum <- video_sum[1]
    Sys.sleep(5)
    
    video_likes <- read_html(html_page) %>% 
      html_nodes(xpath = "//*[@id='maincontent']/div/div/div/div/div[2]/div[1]/div[3]/button[1]/div/div/span") %>% html_text() 
    Sys.sleep(5)
    
    video_tanscript <- read_html(html_page) %>% 
      html_nodes(xpath = "//*[@id='maincontent']/div/div/div/aside/div[2]/div[2]/div/div/div[1]") %>%
      html_text() 
    Sys.sleep(5)
    
    remDr$goBack()
    Sys.sleep(5)
    
    introduction <- append(introduction, video_sum)
    likes <- append(likes,video_likes)
    tanscript <- append(tanscript, video_tanscript)
    title_re <- append(title_re, video_title)
  }

video_info <- data.frame(
  "title" = title_re,
  "introduction" = introduction,
  "likes" = likes,
  "tanscript" = tanscript)
  • After clicking into each video’s page, we first clicked the Read transcript button to expand the transcript text area. Then we scraped all the related information that we might use in the following analysis.
  • After scraping each video’s information, we navigated back to the browse results page.

During the scraping process, we found that the for loop that clicks into each video and scrapes its text was often interrupted, and that some xpaths failed when the scraper was run on different days. We therefore adopted the following recovery methods:

  • As we obtained the list of video titles first, after an interruption we compared the titles that had already been scraped successfully against the full title list to obtain the list of videos still left to scrape (a sketch of this resume step follows the save step below).

  • We took turns using four locator strategies (css, xpath, link text, and partial link text) to locate the videos and the Read transcript and Next buttons.

  • Because this is dynamic web crawling, we added Sys.sleep() calls to each scraping and clicking step to give the website time to react.

  • We then saved the data in .csv format in the data folder.

TED_2 <- video_info
TED <- left_join(title_all, TED_2, by="title")
fwrite(TED, file = here::here("data/TED.csv"))
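
For the first two recovery methods, the following is a minimal sketch rather than our exact code. It assumes the full title list is in title_all and the rows scraped so far are in video_info; anti_join() comes from dplyr, and the css selector shown is hypothetical:

# resume after an interruption: keep only the titles not yet scraped
library(dplyr)
waitforscrape <- anti_join(title_all, video_info, by = "title")

# try several locator strategies in turn until one succeeds
find_first <- function(remDr, locators) {
  for (loc in locators) {
    el <- tryCatch(
      remDr$findElement(using = loc$using, value = loc$value),
      error = function(e) NULL)
    if (!is.null(el)) return(el)
  }
  NULL  # none of the locators matched
}

next_page <- find_first(remDr, list(
  list(using = "link text", value = "Next"),
  list(using = "partial link text", value = "Next"),
  list(using = "css selector", value = "a.pagination__next")  # hypothetical fallback selector
))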

Finally, we closed the server and browser explicitly at the end, since closing the browser incorrectly could interfere with future scraping sessions.

remDr$closeServer()
remDr$close()
rm(remDr)
rm(rD)
gc()

3. Tokenization

3.1 Wrangling and parsing data

After obtaining the transcript text from TED talk videos through web scraping and saving the scraped data in .csv format, we import the resulting data into our analysis. Specifically, we import two tables: “TED.csv” and “add_details_1.csv”. The first table contains 330 observations and 11 variables, while the second contains 310 observations and 2 variables. These tables are stored in the data folder and serve as the primary source of data for our analysis.

# Import data
TED <- read_csv(here::here("data/TED.csv")) #330 obs
views_add <- read_csv(here::here("data/add_details_1.csv")) #310 obs

kable(TED[1:10,], caption = "The example of original TED table") %>%
   kable_paper() %>%
   kableExtra::scroll_box(width = "100%", height = "200px")
The example of original TED table
page.x title speaker.x views_times.x cate page.y speaker.y views_times.y introduction likes tanscript
1 How does artificial intelligence learn? Briana Brownell Mar 2021 AI 1 Briana Brownell Mar 2021

Today, artificial intelligence helps doctors diagnose patients, pilots fly commercial aircraft, and city planners predict traffic. These AIs are often self-taught, working off a simple set of instructions to create a unique array of rules and strategies. So how exactly does a machine learn? Briana Brownell digs into the three basic ways machines investigate, negotiate, and communicate. [Directed by Champ Panupong Techawongthawon, narrated by Safia Elhillo, music by Ambrose Yu].

S…
 (15K) Transcript (28 Languages)Bahasa IndonesiaDeutschEnglishEspañolFrançaisItalianoMagyarPolskiPortuguês brasileiroPortuguês de PortugalRomânăTiếng ViệtTürkçeΕλληνικάРусскийСрпски, Srpskiעבריתالعربيةفارسىکوردی سۆرانیবাংলাதமிழ்ภาษาไทยမြန်မာဘာသာ中文 (简体)中文 (繁體)日本語한국어00:09Today, artificial intelligence helps doctors diagnose patients, pilots fly commercial aircraft, and city planners predict traffic. But no matter what these AIs are doing, the computer scientists who designed them likely don’t know exactly how they’re doing it. This is because artificial intelligence is often self-taught, working off a simple set of instructions to create a unique array of rules and strategies. So how exactly does a machine learn? 00:39There are many different ways to build self-teaching programs. But they all rely on the three basic types of machine learning: unsupervised learning, supervised learning, and reinforcement learning. To see these in action, let’s imagine researchers are trying to pull information from a set of medical data containing thousands of patient profiles. 01:01First up, unsupervised learning. This approach would be ideal for analyzing all the profiles to find general similarities and useful patterns. Maybe certain patients have similar disease presentations, or perhaps a treatment produces specific sets of side effects. This broad pattern-seeking approach can be used to identify similarities between patient profiles and find emerging patterns, all without human guidance. 01:28But let’s imagine doctors are looking for something more specific. These physicians want to create an algorithm for diagnosing a particular condition. They begin by collecting two sets of data— medical images and test results from both healthy patients and those diagnosed with the condition. Then, they input this data into a program designed to identify features shared by the sick patients but not the healthy patients. Based on how frequently it sees certain features, the program will assign values to those features’ diagnostic significance, generating an algorithm for diagnosing future patients. However, unlike unsupervised learning, doctors and computer scientists have an active role in what happens next. Doctors will make the final diagnosis and check the accuracy of the algorithm’s prediction. Then computer scientists can use the updated datasets to adjust the program’s parameters and improve its accuracy. This hands-on approach is called supervised learning. 02:27Now, let’s say these doctors want to design another algorithm to recommend treatment plans. Since these plans will be implemented in stages, and they may change depending on each individual’s response to treatments, the doctors decide to use reinforcement learning. This program uses an iterative approach to gather feedback about which medications, dosages and treatments are most effective. Then, it compares that data against each patient’s profile to create their unique, optimal treatment plan. As the treatments progress and the program receives more feedback, it can constantly update the plan for each patient. None of these three techniques are inherently smarter than any other. While some require more or less human intervention, they all have their own strengths and weaknesses which makes them best suited for certain tasks. However, by using them together, researchers can build complex AI systems, where individual programs can supervise and teach each other. 
For example, when our unsupervised learning program finds groups of patients that are similar, it could send that data to a connected supervised learning program. That program could then incorporate this information into its predictions. Or perhaps dozens of reinforcement learning programs might simulate potential patient outcomes to collect feedback about different treatment plans. 03:43There are numerous ways to create these machine-learning systems, and perhaps the most promising models are those that mimic the relationship between neurons in the brain. These artificial neural networks can use millions of connections to tackle difficult tasks like image recognition, speech recognition, and even language translation. However, the more self-directed these models become, the harder it is for computer scientists to determine how these self-taught algorithms arrive at their solution. Researchers are already looking at ways to make machine learning more transparent. But as AI becomes more involved in our everyday lives, these enigmatic decisions have increasingly large impacts on our work, health, and safety. So as machines continue learning to investigate, negotiate and communicate, we must also consider how to teach them to teach each other to operate ethically.
1 The danger of AI is weirder than you think Janelle Shane Oct 2019 AI 1 Janelle Shane Oct 2019 The danger of artificial intelligence isn’t that it’s going to rebel against us, but that it’s going to do exactly what we ask it to do, says AI researcher Janelle Shane. Sharing the weird, sometimes alarming antics of AI algorithms as they try to solve human problems – like creating new ice cream flavors or recognizing cars on the road – Shane shows why AI doesn’t yet measure up to real brains.  (95K) Transcript (25 Languages)Bahasa IndonesiaDeutschEnglishEspañolFrançaisHrvatskiItalianoMagyarNederlandsPolskiPortuguês brasileiroPortuguês de PortugalRomânăTürkçeČeštinaРусскийУкраїнськаעבריתالعربيةفارسىภาษาไทยမြန်မာဘာသာ中文 (简体)中文 (繁體)한국어00:01So, artificial intelligence is known for disrupting all kinds of industries. What about ice cream? What kind of mind-blowing new flavors could we generate with the power of an advanced artificial intelligence? So I teamed up with a group of coders from Kealing Middle School to find out the answer to this question. They collected over 1,600 existing ice cream flavors, and together, we fed them to an algorithm to see what it would generate. And here are some of the flavors that the AI came up with. 00:40[Pumpkin Trash Break] 00:41(Laughter) 00:43[Peanut Butter Slime] 00:46[Strawberry Cream Disease] 00:48(Laughter) 00:50These flavors are not delicious, as we might have hoped they would be. So the question is: What happened? What went wrong? Is the AI trying to kill us? Or is it trying to do what we asked, and there was a problem? 01:06In movies, when something goes wrong with AI, it’s usually because the AI has decided that it doesn’t want to obey the humans anymore, and it’s got its own goals, thank you very much. In real life, though, the AI that we actually have is not nearly smart enough for that. It has the approximate computing power of an earthworm, or maybe at most a single honeybee, and actually, probably maybe less. Like, we’re constantly learning new things about brains that make it clear how much our AIs don’t measure up to real brains. So today’s AI can do a task like identify a pedestrian in a picture, but it doesn’t have a concept of what the pedestrian is beyond that it’s a collection of lines and textures and things. It doesn’t know what a human actually is. So will today’s AI do what we ask it to do? It will if it can, but it might not do what we actually want. 02:04So let’s say that you were trying to get an AI to take this collection of robot parts and assemble them into some kind of robot to get from Point A to Point B. Now, if you were going to try and solve this problem by writing a traditional-style computer program, you would give the program step-by-step instructions on how to take these parts, how to assemble them into a robot with legs and then how to use those legs to walk to Point B. But when you’re using AI to solve the problem, it goes differently. You don’t tell it how to solve the problem, you just give it the goal, and it has to figure out for itself via trial and error how to reach that goal. And it turns out that the way AI tends to solve this particular problem is by doing this: it assembles itself into a tower and then falls over and lands at Point B. And technically, this solves the problem. Technically, it got to Point B. The danger of AI is not that it’s going to rebel against us, it’s that it’s going to do exactly what we ask it to do. 
So then the trick of working with AI becomes: How do we set up the problem so that it actually does what we want? 03:14So this little robot here is being controlled by an AI. The AI came up with a design for the robot legs and then figured out how to use them to get past all these obstacles. But when David Ha set up this experiment, he had to set it up with very, very strict limits on how big the AI was allowed to make the legs, because otherwise … 03:43(Laughter) 03:48And technically, it got to the end of that obstacle course. So you see how hard it is to get AI to do something as simple as just walk. 03:57So seeing the AI do this, you may say, OK, no fair, you can’t just be a tall tower and fall over, you have to actually, like, use legs to walk. And it turns out, that doesn’t always work, either. This AI’s job was to move fast. They didn’t tell it that it had to run facing forward or that it couldn’t use its arms. So this is what you get when you train AI to move fast, you get things like somersaulting and silly walks. It’s really common. So is twitching along the floor in a heap. 04:32(Laughter) 04:35So in my opinion, you know what should have been a whole lot weirder is the “Terminator” robots. Hacking “The Matrix” is another thing that AI will do if you give it a chance. So if you train an AI in a simulation, it will learn how to do things like hack into the simulation’s math errors and harvest them for energy. Or it will figure out how to move faster by glitching repeatedly into the floor. When you’re working with AI, it’s less like working with another human and a lot more like working with some kind of weird force of nature. And it’s really easy to accidentally give AI the wrong problem to solve, and often we don’t realize that until something has actually gone wrong. 05:16So here’s an experiment I did, where I wanted the AI to copy paint colors, to invent new paint colors, given the list like the ones here on the left. And here’s what the AI actually came up with. 05:29[Sindis Poop, Turdly, Suffer, Gray Pubic] 05:32(Laughter) 05:39So technically, it did what I asked it to. I thought I was asking it for, like, nice paint color names, but what I was actually asking it to do was just imitate the kinds of letter combinations that it had seen in the original. And I didn’t tell it anything about what words mean, or that there are maybe some words that it should avoid using in these paint colors. So its entire world is the data that I gave it. Like with the ice cream flavors, it doesn’t know about anything else. 06:12So it is through the data that we often accidentally tell AI to do the wrong thing. This is a fish called a tench. And there was a group of researchers who trained an AI to identify this tench in pictures. But then when they asked it what part of the picture it was actually using to identify the fish, here’s what it highlighted. Yes, those are human fingers. Why would it be looking for human fingers if it’s trying to identify a fish? Well, it turns out that the tench is a trophy fish, and so in a lot of pictures that the AI had seen of this fish during training, the fish looked like this. 06:51(Laughter) 06:53And it didn’t know that the fingers aren’t part of the fish. 06:58So you see why it is so hard to design an AI that actually can understand what it’s looking at. And this is why designing the image recognition in self-driving cars is so hard, and why so many self-driving car failures are because the AI got confused. I want to talk about an example from 2016. 
There was a fatal accident when somebody was using Tesla’s autopilot AI, but instead of using it on the highway like it was designed for, they used it on city streets. And what happened was, a truck drove out in front of the car and the car failed to brake. Now, the AI definitely was trained to recognize trucks in pictures. But what it looks like happened is the AI was trained to recognize trucks on highway driving, where you would expect to see trucks from behind. Trucks on the side is not supposed to happen on a highway, and so when the AI saw this truck, it looks like the AI recognized it as most likely to be a road sign and therefore, safe to drive underneath. 08:04Here’s an AI misstep from a different field. Amazon recently had to give up on a résumé-sorting algorithm that they were working on when they discovered that the algorithm had learned to discriminate against women. What happened is they had trained it on example résumés of people who they had hired in the past. And from these examples, the AI learned to avoid the résumés of people who had gone to women’s colleges or who had the word “women” somewhere in their resume, as in, “women’s soccer team” or “Society of Women Engineers.” The AI didn’t know that it wasn’t supposed to copy this particular thing that it had seen the humans do. And technically, it did what they asked it to do. They just accidentally asked it to do the wrong thing. 08:46And this happens all the time with AI. AI can be really destructive and not know it. So the AIs that recommend new content in Facebook, in YouTube, they’re optimized to increase the number of clicks and views. And unfortunately, one way that they have found of doing this is to recommend the content of conspiracy theories or bigotry. The AIs themselves don’t have any concept of what this content actually is, and they don’t have any concept of what the consequences might be of recommending this content. 09:22So, when we’re working with AI, it’s up to us to avoid problems. And avoiding things going wrong, that may come down to the age-old problem of communication, where we as humans have to learn how to communicate with AI. We have to learn what AI is capable of doing and what it’s not, and to understand that, with its tiny little worm brain, AI doesn’t really understand what we’re trying to ask it to do. So in other words, we have to be prepared to work with AI that’s not the super-competent, all-knowing AI of science fiction. We have to be prepared to work with an AI that’s the one that we actually have in the present day. And present-day AI is plenty weird enough. 10:09Thank you. 10:11(Applause)
1 The wonderful and terrifying implications of computers that can learn Jeremy Howard Dec 2014 AI 1 Jeremy Howard Dec 2014 What happens when we teach a computer how to learn? Technologist Jeremy Howard shares some surprising new developments in the fast-moving field of deep learning, a technique that can give computers the ability to learn Chinese, or to recognize objects in photos, or to help think through a medical diagnosis. (One deep learning tool, after watching hours of YouTube, taught itself the concept of “cats.”) Get caught up on a field that will change the way the computers around you behav…  (80K) Transcript (26 Languages)DeutschEnglishEspañolFrançaisItalianoLietuvių kalbaMagyarNederlandsPolskiPortuguês brasileiroPortuguês de PortugalRomânăSvenskaTiếng ViệtTürkçeΕλληνικάРусскийУкраїнськаעבריתالعربيةفارسىภาษาไทย中文 (简体)中文 (繁體)日本語한국어FootnotesFootnotes00:00It used to be that if you wanted to get a computer to do something new, you would have to program it. Now, programming, for those of you here that haven’t done it yourself, requires laying out in excruciating detail every single step that you want the computer to do in order to achieve your goal. Now, if you want to do something that you don’t know how to do yourself, then this is going to be a great challenge. 00:24footnotefootnoteSo this was the challenge faced by this man, Arthur Samuel. In 1956, he wanted to get this computer to be able to beat him at checkers. How can you write a program, lay out in excruciating detail, how to be better than you at checkers? So he came up with an idea: he had the computer play against itself thousands of times and learn how to play checkers. And indeed it worked, and in fact, by 1962, this computer had beaten the Connecticut state champion. 00:55footnotefootnoteSo Arthur Samuel was the father of machine learning, and I have a great debt to him, because I am a machine learning practitioner. I was the president of Kaggle, a community of over 200,000 machine learning practictioners. Kaggle puts up competitions to try and get them to solve previously unsolved problems, and it’s been successful hundreds of times. So from this vantage point, I was able to find out a lot about what machine learning can do in the past, can do today, and what it could do in the future. Perhaps the first big success of machine learning commercially was Google. Google showed that it is possible to find information by using a computer algorithm, and this algorithm is based on machine learning. Since that time, there have been many commercial successes of machine learning. Companies like Amazon and Netflix use machine learning to suggest products that you might like to buy, movies that you might like to watch. Sometimes, it’s almost creepy. Companies like LinkedIn and Facebook sometimes will tell you about who your friends might be and you have no idea how it did it, and this is because it’s using the power of machine learning. These are algorithms that have learned how to do this from data rather than being programmed by hand. 02:07footnotefootnoteThis is also how IBM was successful in getting Watson to beat the two world champions at “Jeopardy,” answering incredibly subtle and complex questions like this one. [“The ancient ‘Lion of Nimrud’ went missing from this city’s national museum in 2003 (along with a lot of other stuff)”] This is also why we are now able to see the first self-driving cars. If you want to be able to tell the difference between, say, a tree and a pedestrian, well, that’s pretty important. 
We don’t know how to write those programs by hand, but with machine learning, this is now possible. And in fact, this car has driven over a million miles without any accidents on regular roads. 02:40footnotefootnoteSo we now know that computers can learn, and computers can learn to do things that we actually sometimes don’t know how to do ourselves, or maybe can do them better than us. One of the most amazing examples I’ve seen of machine learning happened on a project that I ran at Kaggle where a team run by a guy called Geoffrey Hinton from the University of Toronto won a competition for automatic drug discovery. Now, what was extraordinary here is not just that they beat all of the algorithms developed by Merck or the international academic community, but nobody on the team had any background in chemistry or biology or life sciences, and they did it in two weeks. How did they do this? They used an extraordinary algorithm called deep learning. So important was this that in fact the success was covered in The New York Times in a front page article a few weeks later. This is Geoffrey Hinton here on the left-hand side. Deep learning is an algorithm inspired by how the human brain works, and as a result it’s an algorithm which has no theoretical limitations on what it can do. The more data you give it and the more computation time you give it, the better it gets. 03:48footnotefootnoteThe New York Times also showed in this article another extraordinary result of deep learning which I’m going to show you now. It shows that computers can listen and understand. 04:00footnotefootnote(Video) Richard Rashid: Now, the last step that I want to be able to take in this process is to actually speak to you in Chinese. Now the key thing there is, we’ve been able to take a large amount of information from many Chinese speakers and produce a text-to-speech system that takes Chinese text and converts it into Chinese language, and then we’ve taken an hour or so of my own voice and we’ve used that to modulate the standard text-to-speech system so that it would sound like me. Again, the result’s not perfect. There are in fact quite a few errors. (In Chinese) (Applause) There’s much work to be done in this area. (In Chinese) (Applause) 05:01Jeremy Howard: Well, that was at a machine learning conference in China. It’s not often, actually, at academic conferences that you do hear spontaneous applause, although of course sometimes at TEDx conferences, feel free. Everything you saw there was happening with deep learning. (Applause) Thank you. The transcription in English was deep learning. The translation to Chinese and the text in the top right, deep learning, and the construction of the voice was deep learning as well. 05:26footnotefootnoteSo deep learning is this extraordinary thing. It’s a single algorithm that can seem to do almost anything, and I discovered that a year earlier, it had also learned to see. In this obscure competition from Germany called the German Traffic Sign Recognition Benchmark, deep learning had learned to recognize traffic signs like this one. Not only could it recognize the traffic signs better than any other algorithm, the leaderboard actually showed it was better than people, about twice as good as people. So by 2011, we had the first example of computers that can see better than people. Since that time, a lot has happened. 
In 2012, Google announced that they had a deep learning algorithm watch YouTube videos and crunched the data on 16,000 computers for a month, and the computer independently learned about concepts such as people and cats just by watching the videos. This is much like the way that humans learn. Humans don’t learn by being told what they see, but by learning for themselves what these things are. Also in 2012, Geoffrey Hinton, who we saw earlier, won the very popular ImageNet competition, looking to try to figure out from one and a half million images what they’re pictures of. As of 2014, we’re now down to a six percent error rate in image recognition. This is better than people, again. 06:41footnotefootnoteSo machines really are doing an extraordinarily good job of this, and it is now being used in industry. For example, Google announced last year that they had mapped every single location in France in two hours, and the way they did it was that they fed street view images into a deep learning algorithm to recognize and read street numbers. Imagine how long it would have taken before: dozens of people, many years. This is also happening in China. Baidu is kind of the Chinese Google, I guess, and what you see here in the top left is an example of a picture that I uploaded to Baidu’s deep learning system, and underneath you can see that the system has understood what that picture is and found similar images. The similar images actually have similar backgrounds, similar directions of the faces, even some with their tongue out. This is not clearly looking at the text of a web page. All I uploaded was an image. So we now have computers which really understand what they see and can therefore search databases of hundreds of millions of images in real time. 07:46footnotefootnoteSo what does it mean now that computers can see? Well, it’s not just that computers can see. In fact, deep learning has done more than that. Complex, nuanced sentences like this one are now understandable with deep learning algorithms. As you can see here, this Stanford-based system showing the red dot at the top has figured out that this sentence is expressing negative sentiment. Deep learning now in fact is near human performance at understanding what sentences are about and what it is saying about those things. Also, deep learning has been used to read Chinese, again at about native Chinese speaker level. This algorithm developed out of Switzerland by people, none of whom speak or understand any Chinese. As I say, using deep learning is about the best system in the world for this, even compared to native human understanding. 08:36This is a system that we put together at my company which shows putting all this stuff together. These are pictures which have no text attached, and as I’m typing in here sentences, in real time it’s understanding these pictures and figuring out what they’re about and finding pictures that are similar to the text that I’m writing. So you can see, it’s actually understanding my sentences and actually understanding these pictures. I know that you’ve seen something like this on Google, where you can type in things and it will show you pictures, but actually what it’s doing is it’s searching the webpage for the text. This is very different from actually understanding the images. This is something that computers have only been able to do for the first time in the last few months. 
09:17footnotefootnoteSo we can see now that computers can not only see but they can also read, and, of course, we’ve shown that they can understand what they hear. Perhaps not surprising now that I’m going to tell you they can write. Here is some text that I generated using a deep learning algorithm yesterday. And here is some text that an algorithm out of Stanford generated. Each of these sentences was generated by a deep learning algorithm to describe each of those pictures. This algorithm before has never seen a man in a black shirt playing a guitar. It’s seen a man before, it’s seen black before, it’s seen a guitar before, but it has independently generated this novel description of this picture. We’re still not quite at human performance here, but we’re close. In tests, humans prefer the computer-generated caption one out of four times. Now this system is now only two weeks old, so probably within the next year, the computer algorithm will be well past human performance at the rate things are going. So computers can also write. 10:16footnotefootnoteSo we put all this together and it leads to very exciting opportunities. For example, in medicine, a team in Boston announced that they had discovered dozens of new clinically relevant features of tumors which help doctors make a prognosis of a cancer. Very similarly, in Stanford, a group there announced that, looking at tissues under magnification, they’ve developed a machine learning-based system which in fact is better than human pathologists at predicting survival rates for cancer sufferers. In both of these cases, not only were the predictions more accurate, but they generated new insightful science. In the radiology case, they were new clinical indicators that humans can understand. In this pathology case, the computer system actually discovered that the cells around the cancer are as important as the cancer cells themselves in making a diagnosis. This is the opposite of what pathologists had been taught for decades. In each of those two cases, they were systems developed by a combination of medical experts and machine learning experts, but as of last year, we’re now beyond that too. This is an example of identifying cancerous areas of human tissue under a microscope. The system being shown here can identify those areas more accurately, or about as accurately, as human pathologists, but was built entirely with deep learning using no medical expertise by people who have no background in the field. Similarly, here, this neuron segmentation. We can now segment neurons about as accurately as humans can, but this system was developed with deep learning using people with no previous background in medicine. 11:56footnotefootnoteSo myself, as somebody with no previous background in medicine, I seem to be entirely well qualified to start a new medical company, which I did. I was kind of terrified of doing it, but the theory seemed to suggest that it ought to be possible to do very useful medicine using just these data analytic techniques. And thankfully, the feedback has been fantastic, not just from the media but from the medical community, who have been very supportive. The theory is that we can take the middle part of the medical process and turn that into data analysis as much as possible, leaving doctors to do what they’re best at. I want to give you an example. 
It now takes us about 15 minutes to generate a new medical diagnostic test and I’ll show you that in real time now, but I’ve compressed it down to three minutes by cutting some pieces out. Rather than showing you creating a medical diagnostic test, I’m going to show you a diagnostic test of car images, because that’s something we can all understand. 12:54So here we’re starting with about 1.5 million car images, and I want to create something that can split them into the angle of the photo that’s being taken. So these images are entirely unlabeled, so I have to start from scratch. With our deep learning algorithm, it can automatically identify areas of structure in these images. So the nice thing is that the human and the computer can now work together. So the human, as you can see here, is telling the computer about areas of interest which it wants the computer then to try and use to improve its algorithm. Now, these deep learning systems actually are in 16,000-dimensional space, so you can see here the computer rotating this through that space, trying to find new areas of structure. And when it does so successfully, the human who is driving it can then point out the areas that are interesting. So here, the computer has successfully found areas, for example, angles. So as we go through this process, we’re gradually telling the computer more and more about the kinds of structures we’re looking for. You can imagine in a diagnostic test this would be a pathologist identifying areas of pathosis, for example, or a radiologist indicating potentially troublesome nodules. And sometimes it can be difficult for the algorithm. In this case, it got kind of confused. The fronts and the backs of the cars are all mixed up. So here we have to be a bit more careful, manually selecting these fronts as opposed to the backs, then telling the computer that this is a type of group that we’re interested in. 14:21So we do that for a while, we skip over a little bit, and then we train the machine learning algorithm based on these couple of hundred things, and we hope that it’s gotten a lot better. You can see, it’s now started to fade some of these pictures out, showing us that it already is recognizing how to understand some of these itself. We can then use this concept of similar images, and using similar images, you can now see, the computer at this point is able to entirely find just the fronts of cars. So at this point, the human can tell the computer, okay, yes, you’ve done a good job of that. 14:53Sometimes, of course, even at this point it’s still difficult to separate out groups. In this case, even after we let the computer try to rotate this for a while, we still find that the left sides and the right sides pictures are all mixed up together. So we can again give the computer some hints, and we say, okay, try and find a projection that separates out the left sides and the right sides as much as possible using this deep learning algorithm. And giving it that hint – ah, okay, it’s been successful. It’s managed to find a way of thinking about these objects that’s separated out these together. 15:26So you get the idea here. This is a case not where the human is being replaced by a computer, but where they’re working together. What we’re doing here is we’re replacing something that used to take a team of five or six people about seven years and replacing it with something that takes 15 minutes for one person acting alone. 15:50So this process takes about four or five iterations. 
You can see we now have 62 percent of our 1.5 million images classified correctly. And at this point, we can start to quite quickly grab whole big sections, check through them to make sure that there’s no mistakes. Where there are mistakes, we can let the computer know about them. And using this kind of process for each of the different groups, we are now up to an 80 percent success rate in classifying the 1.5 million images. And at this point, it’s just a case of finding the small number that aren’t classified correctly, and trying to understand why. And using that approach, by 15 minutes we get to 97 percent classification rates. 16:31footnotefootnoteSo this kind of technique could allow us to fix a major problem, which is that there’s a lack of medical expertise in the world. The World Economic Forum says that there’s between a 10x and a 20x shortage of physicians in the developing world, and it would take about 300 years to train enough people to fix that problem. So imagine if we can help enhance their efficiency using these deep learning approaches? 16:56footnotefootnoteSo I’m very excited about the opportunities. I’m also concerned about the problems. The problem here is that every area in blue on this map is somewhere where services are over 80 percent of employment. What are services? These are services. These are also the exact things that computers have just learned how to do. So 80 percent of the world’s employment in the developed world is stuff that computers have just learned how to do. What does that mean? Well, it’ll be fine. They’ll be replaced by other jobs. For example, there will be more jobs for data scientists. Well, not really. It doesn’t take data scientists very long to build these things. For example, these four algorithms were all built by the same guy. So if you think, oh, it’s all happened before, we’ve seen the results in the past of when new things come along and they get replaced by new jobs, what are these new jobs going to be? It’s very hard for us to estimate this, because human performance grows at this gradual rate, but we now have a system, deep learning, that we know actually grows in capability exponentially. And we’re here. So currently, we see the things around us and we say, “Oh, computers are still pretty dumb.” Right? But in five years’ time, computers will be off this chart. So we need to be starting to think about this capability right now. 18:10We have seen this once before, of course. In the Industrial Revolution, we saw a step change in capability thanks to engines. The thing is, though, that after a while, things flattened out. There was social disruption, but once engines were used to generate power in all the situations, things really settled down. The Machine Learning Revolution is going to be very different from the Industrial Revolution, because the Machine Learning Revolution, it never settles down. The better computers get at intellectual activities, the more they can build better computers to be better at intellectual capabilities, so this is going to be a kind of change that the world has actually never experienced before, so your previous understanding of what’s possible is different. 18:50This is already impacting us. In the last 25 years, as capital productivity has increased, labor productivity has been flat, in fact even a little bit down. 19:01footnotefootnoteSo I want us to start having this discussion now. I know that when I often tell people about this situation, people can be quite dismissive. 
Well, computers can’t really think, they don’t emote, they don’t understand poetry, we don’t really understand how they work. So what? Computers right now can do the things that humans spend most of their time being paid to do, so now’s the time to start thinking about how we’re going to adjust our social structures and economic structures to be aware of this new reality. Thank you. (Applause)
1 How do we find dignity at work? Roy Bahat and Bryn Freedman Feb 2019 AI 1 Roy Bahat and Bryn Freedman Feb 2019 Roy Bahat was worried. His company invests in new technology like AI to make businesses more efficient – but, he wondered, what was AI doing to the people whose jobs might change, go away or become less fulfilling? The question sent him on a two-year research odyssey to discover what motivates people, and why we work. In this conversation with curator Bryn Freedman, he shares what he learned, including some surprising insights that will shape the conversation about the future of …  (66K) Transcript (19 Languages)DeutschEnglishEspañolFrançaisItalianoMagyarPortuguês brasileiroPortuguês de PortugalRomânăTiếng ViệtРусскийУкраїнськаעבריתالعربيةفارسى中文 (简体)中文 (繁體)日本語한국어00:00Bryn Freedman: You’re a guy whose company funds these AI programs and invests. So why should we trust you to not have a bias and tell us something really useful for the rest of us about the future of work? 00:17Roy Bahat: Yes, I am. And when you wake up in the morning and you read the newspaper and it says, “The robots are coming, they may take all our jobs,” as a start-up investor focused on the future of work, our fund was the first one to say artificial intelligence should be a focus for us. 00:32So I woke up one morning and read that and said, “Oh, my gosh, they’re talking about me. That’s me who’s doing that.” And then I thought: wait a minute. If things continue, then maybe not only will the start-ups in which we invest struggle because there won’t be people to have jobs to pay for the things that they make and buy them, but our economy and society might struggle, too. 00:57And look, I should be the guy who sits here and tells you, “Everything is going to be fine. It’s all going to work out great. Hey, when they introduced the ATM machine, years later, there’s more tellers in banks.” It’s true. And yet, when I looked at it, I thought, “This is going to accelerate. And if it does accelerate, there’s a chance the center doesn’t hold.” But I figured somebody must know the answer to this; there are so many ideas out there. And I read all the books, and I went to the conferences, and at one point, we counted more than 100 efforts to study the future of work. And it was a frustrating experience, because I’d hear the same back-and-forth over and over again: “The robots are coming!” And then somebody else would say, “Oh, don’t worry about that, they’ve always said that and it turns out OK.” Then somebody else would say, “Well, it’s really about the meaning of your job, anyway.” And then everybody would shrug and go off and have a drink. And it felt like there was this Kabuki theater of this discussion, where nobody was talking to each other. 01:54And many of the people that I knew and worked with in the technology world were not speaking to policy makers; the policy makers were not speaking to them. And so we partnered with a nonpartisan think tank NGO called New America to study this issue. And we brought together a group of people, including an AI czar at a technology company and a video game designer and a heartland conservative and a Wall Street investor and a socialist magazine editor – literally, all in the same room; it was occasionally awkward – to try to figure out what is it that will happen here. 02:25The question we asked was simple. It was: What is the effect of technology on work going to be? 
And we looked out 10 to 20 years, because we wanted to look out far enough that there could be real change, but soon enough that we weren’t talking about teleportation or anything like that. And we recognized – and I think every year we’re reminded of this in the world – that predicting what’s going to happen is hard. So instead of predicting, there are other things you can do. You can try to imagine alternate possible futures, which is what we did. We did a scenario-planning exercise, and we imagined cases where no job is safe. We imagined cases where every job is safe. And we imagined every distinct possibility we could. 03:06And the result, which really surprised us, was when you think through those futures and you think what should we do, the answers about what we should do actually turn out to be the same, no matter what happens. And the irony of looking out 10 to 20 years into the future is, you realize that the things we want to act on are actually already happening right now. The automation is right now, the future is right now. 03:30BF: So what does that mean, and what does that tell us? If the future is now, what is it that we should be doing, and what should we be thinking about? 03:37RB: We have to understand the problem first. And so the data are that as the economy becomes more productive and individual workers become more productive, their wages haven’t risen. If you look at the proportion of prime working-age men, in the United States at least, who work now versus in 1960, we have three times as many men not working. And then you hear the stories. 03:59I sat down with a group of Walmart workers and said, “What do you think about this cashier, this futuristic self-checkout thing?” They said, “That’s nice, but have you heard about the cash recycler? That’s a machine that’s being installed right now, and is eliminating two jobs at every Walmart right now.” And so we just thought, “Geez. We don’t understand the problem.” And so we looked at the voices that were the ones that were excluded, which is all of the people affected by this change. And we decided to listen to them, sort of “automation and its discontents.” 04:26And I’ve spent the last couple of years doing that. I’ve been to Flint, Michigan, and Youngstown, Ohio, talking about entrepreneurs, trying to make it work in a very different environment from New York or San Francisco or London or Tokyo. I’ve been to prisons twice to talk to inmates about their jobs after they leave. I’ve sat down with truck drivers to ask them about the self-driving truck, with people who, in addition to their full-time job, care for an aging relative. And when you talk to people, there were two themes that came out loud and clear. 04:56The first one was that people are less looking for more money or get out of the fear of the robot taking their job, and they just want something stable. They want something predictable. So if you survey people and ask them what they want out of work, for everybody who makes less than 150,000 dollars a year, they’ll take a more stable and secure income, on average, over earning more money. And if you think about the fact that not only for all of the people across the earth who don’t earn a living, but for those who do, the vast majority earn a different amount from month to month and have an instability, all of a sudden you realize, “Wait a minute. We have a real problem on our hands.” 05:35And the second thing they say, which took us a longer time to understand, is they say they want dignity. 
And that concept of self-worth through work emerged again and again and again in our conversations. 05:49BF: So, I certainly appreciate this answer. But you can’t eat dignity, you can’t clothe your children with self-esteem. So, what is that, how do you reconcile – what does dignity mean, and what is the relationship between dignity and stability? 06:06RB: You can’t eat dignity. You need stability first. And the good news is, many of the conversations that are happening right now are about how we solve that. You know, I’m a proponent of studying guaranteed income, as one example, conversations about how health care gets provided and other benefits. Those conversations are happening, and we’re at a time where we must figure that out. It is the crisis of our era. 06:28And my point of view after talking to people is that we may do that, and it still might not be enough. Because what we need to do from the beginning is understand what is it about work that gives people dignity, so they can live the lives that they want to live. And so that concept of dignity is … it’s difficult to get your hands around, because when many people hear it – especially, to be honest, rich people – they hear “meaning.” They hear “My work is important to me.” And again, if you survey people and you ask them, “How important is it to you that your work be important to you?” only people who make 150,000 dollars a year or more say that it is important to them that their work be important. 07:12BF: Meaning, meaningful? 07:13RB: Just defined as, “Is your work important to you?” Whatever somebody took that to mean. And yet, of course dignity is essential. We talked to truck drivers who said, “I saw my cousin drive, and I got on the open road and it was amazing. And I started making more money than people who went to college.” Then they’d get to the end of their thought and say something like, “People need their fruits and vegetables in the morning, and I’m the guy who gets it to them.” 07:38We talked to somebody who, in addition to his job, was caring for his aunt. He was making plenty of money. At one point we just asked, “What is it about caring for your aunt? Can’t you just pay somebody to do it?” He said, “My aunt doesn’t want somebody we pay for. My aunt wants me.” So there was this concept there of being needed. 07:56If you study the word “dignity,” it’s fascinating. It’s one of the oldest words in the English language, from antiquity. And it has two meanings: one is self-worth, and the other is that something is suitable, it’s fitting, meaning that you’re part of something greater than yourself, and it connects to some broader whole. In other words, that you’re needed. 08:15BF: So how do you answer this question, this concept that we don’t pay teachers, and we don’t pay eldercare workers, and we don’t pay people who really care for people and are needed, enough? 08:26RB: Well, the good news is, people are finally asking the question. So as AI investors, we often get phone calls from foundations or CEOs and boardrooms saying, “What do we do about this?” And they used to be asking, “What do we do about introducing automation?” And now they’re asking, “What do we do about self-worth?” And they know that the employees who work for them who have a spouse who cares for somebody, that dignity is essential to their ability to just do their job. 08:50I think there’s two kinds of answers: there’s the money side of just making your life work. That’s stability. You need to eat. 
And then you think about our culture more broadly, and you ask: Who do we make into heroes? And, you know, what I want is to see the magazine cover that is the person who is the heroic caregiver. Or the Netflix series that dramatizes the person who makes all of our other lives work so we can do the things we do. Let’s make heroes out of those people. That’s the Netflix show that I would binge. 09:20And we’ve had chroniclers of this before – Studs Terkel, the oral history of the working experience in the United States. And what we need is the experience of needing one another and being connected to each other. Maybe that’s the answer for how we all fit as a society. And the thought exercise, to me, is: if you were to go back 100 years and have people – my grandparents, great-grandparents, a tailor, worked in a mine – they look at what all of us do for a living and say, “That’s not work.” We sit there and type and talk, and there’s no danger of getting hurt. And my guess is that if you were to imagine 100 years from now, we’ll still be doing things for each other. We’ll still need one another. And we just will think of it as work. 10:00The entire thing I’m trying to say is that dignity should not just be about having a job. Because if you say you need a job to have dignity, which many people say, the second you say that, you say to all the parents and all the teachers and all the caregivers that all of a sudden, because they’re not being paid for what they’re doing, it somehow lacks this essential human quality. To me, that’s the great puzzle of our time: Can we figure out how to provide that stability throughout life, and then can we figure out how to create an inclusive, not just racially, gender, but multigenerationally inclusive – I mean, every different human experience included – in this way of understanding how we can be needed by one another. 10:41BF: Thank you. RB: Thank you. 10:42BF: Thank you very much for your participation. 10:44(Applause)
The incredible inventions of intuitive AI. Maurice Conti, Feb 2017. Topic: AI.
What do you get when you give a design tool a digital nervous system? Computers that improve our ability to think and imagine, and robotic systems that come up with (and build) radical new designs for bridges, cars, drones and much more – all by themselves. Take a tour of the Augmented Age with futurist Maurice Conti and preview a time when robots and humans will work side-by-side to accomplish things neither could do alone.
Transcript (23 languages):
00:00How many of you are creatives, designers, engineers, entrepreneurs, artists, or maybe you just have a really big imagination? Show of hands? (Cheers) 00:10That’s most of you. I have some news for us creatives. Over the course of the next 20 years, more will change around the way we do our work than has happened in the last 2,000. In fact, I think we’re at the dawn of a new age in human history. 00:33Now, there have been four major historical eras defined by the way we work. The Hunter-Gatherer Age lasted several million years. And then the Agricultural Age lasted several thousand years. The Industrial Age lasted a couple of centuries. And now the Information Age has lasted just a few decades. And now today, we’re on the cusp of our next great era as a species. 01:01Welcome to the Augmented Age. In this new era, your natural human capabilities are going to be augmented by computational systems that help you think, robotic systems that help you make, and a digital nervous system that connects you to the world far beyond your natural senses. Let’s start with cognitive augmentation. How many of you are augmented cyborgs? 01:24(Laughter) 01:26I would actually argue that we’re already augmented. Imagine you’re at a party, and somebody asks you a question that you don’t know the answer to. If you have one of these, in a few seconds, you can know the answer. But this is just a primitive beginning. Even Siri is just a passive tool. In fact, for the last three-and-a-half million years, the tools that we’ve had have been completely passive. They do exactly what we tell them and nothing more. Our very first tool only cut where we struck it. The chisel only carves where the artist points it. And even our most advanced tools do nothing without our explicit direction. In fact, to date, and this is something that frustrates me, we’ve always been limited by this need to manually push our wills into our tools – like, manual, literally using our hands, even with computers. But I’m more like Scotty in “Star Trek.” 02:26(Laughter) 02:28I want to have a conversation with a computer. I want to say, “Computer, let’s design a car,” and the computer shows me a car. And I say, “No, more fast-looking, and less German,” and bang, the computer shows me an option. 02:39(Laughter) 02:42That conversation might be a little ways off, probably less than many of us think, but right now, we’re working on it. Tools are making this leap from being passive to being generative. Generative design tools use a computer and algorithms to synthesize geometry to come up with new designs all by themselves. All it needs are your goals and your constraints. 03:06I’ll give you an example.
In the case of this aerial drone chassis, all you would need to do is tell it something like, it has four propellers, you want it to be as lightweight as possible, and you need it to be aerodynamically efficient. Then what the computer does is it explores the entire solution space: every single possibility that solves and meets your criteria – millions of them. It takes big computers to do this. But it comes back to us with designs that we, by ourselves, never could’ve imagined. And the computer’s coming up with this stuff all by itself – no one ever drew anything, and it started completely from scratch. And by the way, it’s no accident that the drone body looks just like the pelvis of a flying squirrel. 03:51(Laughter) 03:54It’s because the algorithms are designed to work the same way evolution does. 03:58What’s exciting is we’re starting to see this technology out in the real world. We’ve been working with Airbus for a couple of years on this concept plane for the future. It’s a ways out still. But just recently we used a generative-design AI to come up with this. This is a 3D-printed cabin partition that’s been designed by a computer. It’s stronger than the original yet half the weight, and it will be flying in the Airbus A320 later this year. So computers can now generate; they can come up with their own solutions to our well-defined problems. But they’re not intuitive. They still have to start from scratch every single time, and that’s because they never learn. Unlike Maggie. 04:44(Laughter) 04:45Maggie’s actually smarter than our most advanced design tools. What do I mean by that? If her owner picks up that leash, Maggie knows with a fair degree of certainty it’s time to go for a walk. And how did she learn? Well, every time the owner picked up the leash, they went for a walk. And Maggie did three things: she had to pay attention, she had to remember what happened and she had to retain and create a pattern in her mind. 05:11Interestingly, that’s exactly what computer scientists have been trying to get AIs to do for the last 60 or so years. Back in 1952, they built this computer that could play Tic-Tac-Toe. Big deal. Then 45 years later, in 1997, Deep Blue beats Kasparov at chess. 2011, Watson beats these two humans at Jeopardy, which is much harder for a computer to play than chess is. In fact, rather than working from predefined recipes, Watson had to use reasoning to overcome his human opponents. And then a couple of weeks ago, DeepMind’s AlphaGo beats the world’s best human at Go, which is the most difficult game that we have. In fact, in Go, there are more possible moves than there are atoms in the universe. So in order to win, what AlphaGo had to do was develop intuition. And in fact, at some points, AlphaGo’s programmers didn’t understand why AlphaGo was doing what it was doing. 06:19And things are moving really fast. I mean, consider – in the space of a human lifetime, computers have gone from a child’s game to what’s recognized as the pinnacle of strategic thought. What’s basically happening is computers are going from being like Spock to being a lot more like Kirk. 06:39(Laughter) 06:43Right? From pure logic to intuition. Would you cross this bridge? Most of you are saying, “Oh, hell no!” 06:52(Laughter) 06:54And you arrived at that decision in a split second. You just sort of knew that bridge was unsafe. And that’s exactly the kind of intuition that our deep-learning systems are starting to develop right now. 
Very soon, you’ll literally be able to show something you’ve made, you’ve designed, to a computer, and it will look at it and say, “Sorry, homie, that’ll never work. You have to try again.” Or you could ask it if people are going to like your next song, or your next flavor of ice cream. Or, much more importantly, you could work with a computer to solve a problem that we’ve never faced before. For instance, climate change. We’re not doing a very good job on our own, we could certainly use all the help we can get. That’s what I’m talking about, technology amplifying our cognitive abilities so we can imagine and design things that were simply out of our reach as plain old un-augmented humans. 07:47So what about making all of this crazy new stuff that we’re going to invent and design? I think the era of human augmentation is as much about the physical world as it is about the virtual, intellectual realm. How will technology augment us? In the physical world, robotic systems. OK, there’s certainly a fear that robots are going to take jobs away from humans, and that is true in certain sectors. But I’m much more interested in this idea that humans and robots working together are going to augment each other, and start to inhabit a new space. 08:24This is our applied research lab in San Francisco, where one of our areas of focus is advanced robotics, specifically, human-robot collaboration. And this is Bishop, one of our robots. As an experiment, we set it up to help a person working in construction doing repetitive tasks – tasks like cutting out holes for outlets or light switches in drywall. 08:46(Laughter) 08:49So, Bishop’s human partner can tell what to do in plain English and with simple gestures, kind of like talking to a dog, and then Bishop executes on those instructions with perfect precision. We’re using the human for what the human is good at: awareness, perception and decision making. And we’re using the robot for what it’s good at: precision and repetitiveness. 09:10Here’s another cool project that Bishop worked on. The goal of this project, which we called the HIVE, was to prototype the experience of humans, computers and robots all working together to solve a highly complex design problem. The humans acted as labor. They cruised around the construction site, they manipulated the bamboo – which, by the way, because it’s a non-isomorphic material, is super hard for robots to deal with. But then the robots did this fiber winding, which was almost impossible for a human to do. And then we had an AI that was controlling everything. It was telling the humans what to do, telling the robots what to do and keeping track of thousands of individual components. What’s interesting is, building this pavilion was simply not possible without human, robot and AI augmenting each other. 09:57OK, I’ll share one more project. This one’s a little bit crazy. We’re working with Amsterdam-based artist Joris Laarman and his team at MX3D to generatively design and robotically print the world’s first autonomously manufactured bridge. So, Joris and an AI are designing this thing right now, as we speak, in Amsterdam. And when they’re done, we’re going to hit “Go,” and robots will start 3D printing in stainless steel, and then they’re going to keep printing, without human intervention, until the bridge is finished. 10:29So, as computers are going to augment our ability to imagine and design new stuff, robotic systems are going to help us build and make things that we’ve never been able to make before. 
But what about our ability to sense and control these things? What about a nervous system for the things that we make? 10:48Our nervous system, the human nervous system, tells us everything that’s going on around us. But the nervous system of the things we make is rudimentary at best. For instance, a car doesn’t tell the city’s public works department that it just hit a pothole at the corner of Broadway and Morrison. A building doesn’t tell its designers whether or not the people inside like being there, and the toy manufacturer doesn’t know if a toy is actually being played with – how and where and whether or not it’s any fun. Look, I’m sure that the designers imagined this lifestyle for Barbie when they designed her. 11:22(Laughter) 11:24But what if it turns out that Barbie’s actually really lonely? 11:27(Laughter) 11:31If the designers had known what was really happening in the real world with their designs – the road, the building, Barbie – they could’ve used that knowledge to create an experience that was better for the user. What’s missing is a nervous system connecting us to all of the things that we design, make and use. What if all of you had that kind of information flowing to you from the things you create in the real world? With all of the stuff we make, we spend a tremendous amount of money and energy – in fact, last year, about two trillion dollars – convincing people to buy the things we’ve made. But if you had this connection to the things that you design and create after they’re out in the real world, after they’ve been sold or launched or whatever, we could actually change that, and go from making people want our stuff, to just making stuff that people want in the first place. 12:21The good news is, we’re working on digital nervous systems that connect us to the things we design. We’re working on one project with a couple of guys down in Los Angeles called the Bandito Brothers and their team. And one of the things these guys do is build insane cars that do absolutely insane things. These guys are crazy – 12:44(Laughter) 12:45in the best way. And what we’re doing with them is taking a traditional race-car chassis and giving it a nervous system. 12:54So we instrumented it with dozens of sensors, put a world-class driver behind the wheel, took it out to the desert and drove the hell out of it for a week. And the car’s nervous system captured everything that was happening to the car. We captured four billion data points; all of the forces that it was subjected to. And then we did something crazy. We took all of that data, and plugged it into a generative-design AI we call “Dreamcatcher.” So what do you get when you give a design tool a nervous system, and you ask it to build you the ultimate car chassis? You get this. This is something that a human could never have designed. Except a human did design this, but it was a human that was augmented by a generative-design AI, a digital nervous system and robots that can actually fabricate something like this. 13:47So if this is the future, the Augmented Age, and we’re going to be augmented cognitively, physically and perceptually, what will that look like? What is this wonderland going to be like? 14:00I think we’re going to see a world where we’re moving from things that are fabricated to things that are farmed. Where we’re moving from things that are constructed to that which is grown. We’re going to move from being isolated to being connected. And we’ll move away from extraction to embrace aggregation.
I also think we’ll shift from craving obedience from our things to valuing autonomy. 14:30Thanks to our augmented capabilities, our world is going to change dramatically. We’re going to have a world with more variety, more connectedness, more dynamism, more complexity, more adaptability and, of course, more beauty. The shape of things to come will be unlike anything we’ve ever seen before. Why? Because what will be shaping those things is this new partnership between technology, nature and humanity. That, to me, is a future well worth looking forward to. 15:03Thank you all so much. 15:04(Applause)
How AI can bring on a second Industrial Revolution. Kevin Kelly, Dec 2016. Topic: AI.
“The actual path of a raindrop as it goes down the valley is unpredictable, but the general direction is inevitable,” says digital visionary Kevin Kelly – and technology is much the same, driven by patterns that are surprising but inevitable. Over the next 20 years, he says, our penchant for making things smarter and smarter will have a profound impact on nearly everything we do. Kelly explores three trends in AI we need to understand in order to embrace it and steer its developm…
Transcript (24 languages):
00:02I’m going to talk a little bit about where technology’s going. And often technology comes to us, we’re surprised by what it brings. But there’s actually a large aspect of technology that’s much more predictable, and that’s because technological systems of all sorts have leanings, they have urgencies, they have tendencies. And those tendencies are derived from the very nature of the physics, chemistry of wires and switches and electrons, and they will make reoccurring patterns again and again. And so those patterns produce these tendencies, these leanings. 00:42You can almost think of it as sort of like gravity. Imagine raindrops falling into a valley. The actual path of a raindrop as it goes down the valley is unpredictable. We cannot see where it’s going, but the general direction is very inevitable: it’s downward. And so these baked-in tendencies and urgencies in technological systems give us a sense of where things are going at the large form. So in a large sense, I would say that telephones were inevitable, but the iPhone was not. The Internet was inevitable, but Twitter was not. 01:21So we have many ongoing tendencies right now, and I think one of the chief among them is this tendency to make things smarter and smarter. I call it cognifying – cognification – also known as artificial intelligence, or AI. And I think that’s going to be one of the most influential developments and trends and directions and drives in our society in the next 20 years. 01:48So, of course, it’s already here. We already have AI, and often it works in the background, in the back offices of hospitals, where it’s used to diagnose X-rays better than a human doctor. It’s in legal offices, where it’s used to go through legal evidence better than a human paralawyer. It’s used to fly the plane that you came here with. Human pilots only flew it seven to eight minutes, the rest of the time the AI was driving. And of course, in Netflix and Amazon, it’s in the background, making those recommendations. That’s what we have today. 02:22And we have an example, of course, in a more front-facing aspect of it, with the win of the AlphaGo, who beat the world’s greatest Go champion. But it’s more than that. If you play a video game, you’re playing against an AI. But recently, Google taught their AI to actually learn how to play video games. Again, teaching video games was already done, but learning how to play a video game is another step. That’s artificial smartness. What we’re doing is taking this artificial smartness and we’re making it smarter and smarter.
03:06There are three aspects to this general trend that I think are underappreciated; I think we would understand AI a lot better if we understood these three things. I think these things also would help us embrace AI, because it’s only by embracing it that we actually can steer it. We can actually steer the specifics by embracing the larger trend. 03:27So let me talk about those three different aspects. The first one is: our own intelligence has a very poor understanding of what intelligence is. We tend to think of intelligence as a single dimension, that it’s kind of like a note that gets louder and louder. It starts like with IQ measurement. It starts with maybe a simple low IQ in a rat or mouse, and maybe there’s more in a chimpanzee, and then maybe there’s more in a stupid person, and then maybe an average person like myself, and then maybe a genius. And this single IQ intelligence is getting greater and greater. That’s completely wrong. That’s not what intelligence is – not what human intelligence is, anyway. It’s much more like a symphony of different notes, and each of these notes is played on a different instrument of cognition. 04:15There are many types of intelligences in our own minds. We have deductive reasoning, we have emotional intelligence, we have spatial intelligence; we have maybe 100 different types that are all grouped together, and they vary in different strengths with different people. And of course, if we go to animals, they also have another basket – another symphony of different kinds of intelligences, and sometimes those same instruments are the same that we have. They can think in the same way, but they may have a different arrangement, and maybe they’re higher in some cases than humans, like long-term memory in a squirrel is actually phenomenal, so it can remember where it buried its nuts. But in other cases they may be lower. 04:58When we go to make machines, we’re going to engineer them in the same way, where we’ll make some of those types of smartness much greater than ours, and many of them won’t be anywhere near ours, because they’re not needed. So we’re going to take these things, these artificial clusters, and we’ll be adding more varieties of artificial cognition to our AIs. We’re going to make them very, very specific. 05:26So your calculator is smarter than you are in arithmetic already; your GPS is smarter than you are in spatial navigation; Google, Bing, are smarter than you are in long-term memory. And we’re going to take, again, these kinds of different types of thinking and we’ll put them into, like, a car. The reason why we want to put them in a car so the car drives, is because it’s not driving like a human. It’s not thinking like us. That’s the whole feature of it. It’s not being distracted, it’s not worrying about whether it left the stove on, or whether it should have majored in finance. It’s just driving. 06:05(Laughter) 06:06Just driving, OK? And we actually might even come to advertise these as “consciousness-free.” They’re without consciousness, they’re not concerned about those things, they’re not distracted. 06:17So in general, what we’re trying to do is make as many different types of thinking as we can. We’re going to populate the space of all the different possible types, or species, of thinking. And there actually may be some problems that are so difficult in business and science that our own type of human thinking may not be able to solve them alone. 
We may need a two-step program, which is to invent new kinds of thinking that we can work alongside of to solve these really large problems, say, like dark energy or quantum gravity. 06:56What we’re doing is making alien intelligences. You might even think of this as, sort of, artificial aliens in some senses. And they’re going to help us think different, because thinking different is the engine of creation and wealth and new economy. 07:13The second aspect of this is that we are going to use AI to basically make a second Industrial Revolution. The first Industrial Revolution was based on the fact that we invented something I would call artificial power. Previous to that, during the Agricultural Revolution, everything that was made had to be made with human muscle or animal power. That was the only way to get anything done. The great innovation during the Industrial Revolution was, we harnessed steam power, fossil fuels, to make this artificial power that we could use to do anything we wanted to do. So today when you drive down the highway, you are, with a flick of the switch, commanding 250 horses – 250 horsepower – which we can use to build skyscrapers, to build cities, to build roads, to make factories that would churn out lines of chairs or refrigerators way beyond our own power. And that artificial power can also be distributed on wires on a grid to every home, factory, farmstead, and anybody could buy that artificial power, just by plugging something in. 08:27So this was a source of innovation as well, because a farmer could take a manual hand pump, and they could add this artificial power, this electricity, and he’d have an electric pump. And you multiply that by thousands or tens of thousands of times, and that formula was what brought us the Industrial Revolution. All the things that we see, all this progress that we now enjoy, has come from the fact that we’ve done that. 08:50We’re going to do the same thing now with AI. We’re going to distribute that on a grid, and now you can take that electric pump. You can add some artificial intelligence, and now you have a smart pump. And that, multiplied by a million times, is going to be this second Industrial Revolution. So now the car is going down the highway, it’s 250 horsepower, but in addition, it’s 250 minds. That’s the auto-driven car. It’s like a new commodity; it’s a new utility. The AI is going to flow across the grid – the cloud – in the same way electricity did. 09:22So everything that we had electrified, we’re now going to cognify. And I would suggest, then, that the formula for the next 10,000 start-ups is very, very simple, which is to take x and add AI. That is the formula, that’s what we’re going to be doing. And that is the way in which we’re going to make this second Industrial Revolution. And by the way – right now, this minute, you can log on to Google and you can purchase AI for six cents, 100 hits. That’s available right now. 09:54So the third aspect of this is that when we take this AI and embody it, we get robots. And robots are going to be bots, they’re going to be doing many of the tasks that we have already done. A job is just a bunch of tasks, so they’re going to redefine our jobs because they’re going to do some of those tasks. But they’re also going to create whole new categories, a whole new slew of tasks that we didn’t know we wanted to do before. 
They’re going to actually engender new kinds of jobs, new kinds of tasks that we want done, just as automation made up a whole bunch of new things that we didn’t know we needed before, and now we can’t live without them. So they’re going to produce even more jobs than they take away, but it’s important that a lot of the tasks that we’re going to give them are tasks that can be defined in terms of efficiency or productivity. If you can specify a task, either manual or conceptual, that can be specified in terms of efficiency or productivity, that goes to the bots. Productivity is for robots. What we’re really good at is basically wasting time. 11:04(Laughter) 11:05We’re really good at things that are inefficient. Science is inherently inefficient. It runs on the fact that you have one failure after another. It runs on the fact that you make tests and experiments that don’t work, otherwise you’re not learning. It runs on the fact that there is not a lot of efficiency in it. Innovation by definition is inefficient, because you make prototypes, because you try stuff that fails, that doesn’t work. Exploration is inherently inefficient. Art is not efficient. Human relationships are not efficient. These are all the kinds of things we’re going to gravitate to, because they’re not efficient. Efficiency is for robots. We’re also going to learn that we’re going to work with these AIs because they think differently than us. 11:50When Deep Blue beat the world’s best chess champion, people thought it was the end of chess. But actually, it turns out that today, the best chess champion in the world is not an AI. And it’s not a human. It’s the team of a human and an AI. The best medical diagnostician is not a doctor, it’s not an AI, it’s the team. We’re going to be working with these AIs, and I think you’ll be paid in the future by how well you work with these bots. So that’s the third thing, is that they’re different, they’re utility and they are going to be something we work with rather than against. We’re working with these rather than against them. 12:30So, the future: Where does that take us? I think that 25 years from now, they’ll look back and look at our understanding of AI and say, “You didn’t have AI. In fact, you didn’t even have the Internet yet, compared to what we’re going to have 25 years from now.” There are no AI experts right now. There’s a lot of money going to it, there are billions of dollars being spent on it; it’s a huge business, but there are no experts, compared to what we’ll know 20 years from now. So we are just at the beginning of the beginning, we’re in the first hour of all this. We’re in the first hour of the Internet. We’re in the first hour of what’s coming. The most popular AI product in 20 years from now, that everybody uses, has not been invented yet. That means that you’re not late. 13:23Thank you. 13:24(Laughter) 13:25(Applause)
How AI can enhance our memory, work and social lives. Tom Gruber, Aug 2017. Topic: AI.
How smart can our machines make us? Tom Gruber, co-creator of Siri, wants to make “humanistic AI” that augments and collaborates with us instead of competing with (or replacing) us. He shares his vision for a future where AI helps us achieve superhuman performance in perception, creativity and cognitive function – from turbocharging our design skills to helping us remember everything we’ve ever read and the name of everyone we’ve ever met. “We are in the middle of a renaissance i…
Transcript (20 languages):
00:01I’m here to offer you a new way to think about my field, artificial intelligence. I think the purpose of AI is to empower humans with machine intelligence. And as machines get smarter, we get smarter. I call this “humanistic AI” – artificial intelligence designed to meet human needs by collaborating and augmenting people. Now, today I’m happy to see that the idea of an intelligent assistant is mainstream. It’s the well-accepted metaphor for the interface between humans and AI. And the one I helped create is called Siri. 00:42You know Siri. Siri is the thing that knows your intent and helps you do it for you, helps you get things done. But what you might not know is that we designed Siri as humanistic AI, to augment people with a conversational interface that made it possible for them to use mobile computing, regardless of who they were and their abilities. 01:06Now for most of us, the impact of this technology is to make things a little bit easier to use. But for my friend Daniel, the impact of the AI in these systems is a life changer. You see, Daniel is a really social guy, and he’s blind and quadriplegic, which makes it hard to use those devices that we all take for granted. The last time I was at his house, his brother said, “Hang on a second, Daniel’s not ready. He’s on the phone with a woman he met online.” I’m like, “That’s cool, how’d he do it?” Well, Daniel uses Siri to manage his own social life – his email, text and phone – without depending on his caregivers. This is kind of interesting, right? The irony here is great. Here’s the man whose relationship with AI helps him have relationships with genuine human beings. And this is humanistic AI. 02:07Another example with life-changing consequences is diagnosing cancer. When a doctor suspects cancer, they take a sample and send it to a pathologist, who looks at it under a microscope. Now, pathologists look at hundreds of slides and millions of cells every day. So to support this task, some researchers made an AI classifier. Now, the classifier says, “Is this cancer or is this not cancer?” looking at the pictures. The classifier was pretty good, but not as good as the person, who got it right most of the time. 02:46But when they combine the ability of the machine and the human together, accuracy went to 99.5 percent. Adding that AI to a partnership eliminated 85 percent of the errors that the human pathologist would have made working alone. That’s a lot of cancer that would have otherwise gone untreated. Now, for the curious, it turns out that the human was better at rejecting false positives, and the machine was better at recognizing those hard-to-spot cases.
But the lesson here isn’t about which agent is better at this image-classification task. Those things are changing every day. The lesson here is that by combining the abilities of the human and machine, it created a partnership that had superhuman performance. And that is humanistic AI. 03:42Now let’s look at another example with turbocharging performance. This is design. Now, let’s say you’re an engineer. You want to design a new frame for a drone. You get out your favorite software tools, CAD tools, and you enter the form and the materials, and then you analyze performance. That gives you one design. If you give those same tools to an AI, it can generate thousands of designs. 04:10This video by Autodesk is amazing. This is real stuff. So this transforms how we do design. The human engineer now says what the design should achieve, and the machine says, “Here’s the possibilities.” Now in her job, the engineer’s job is to pick the one that best meets the goals of the design, which she knows as a human better than anyone else, using human judgment and expertise. In this case, the winning form looks kind of like something nature would have designed, minus a few million years of evolution and all that unnecessary fur. 04:48Now let’s see where this idea of humanistic AI might lead us if we follow it into the speculative beyond. What’s a kind of augmentation that we would all like to have? Well, how about cognitive enhancement? Instead of asking, “How smart can we make our machines?” let’s ask “How smart can our machines make us?” I mean, take memory for example. Memory is the foundation of human intelligence. But human memory is famously flawed. We’re great at telling stories, but not getting the details right. And our memories – they decay over time. I mean, like, where did the ’60s go, and can I go there, too? 05:34(Laughter) 05:36But what if you could have a memory that was as good as computer memory, and was about your life? What if you could remember every person you ever met, how to pronounce their name, their family details, their favorite sports, the last conversation you had with them? If you had this memory all your life, you could have the AI look at all the interactions you had with people over time and help you reflect on the long arc of your relationships. What if you could have the AI read everything you’ve ever read and listen to every song you’ve ever heard? From the tiniest clue, it could help you retrieve anything you’ve ever seen or heard before. Imagine what that would do for the ability to make new connections and form new ideas. 06:26And what about our bodies? What if we could remember the consequences of every food we eat, every pill we take, every all-nighter we pull? We could do our own science on our own data about what makes us feel good and stay healthy. And imagine how this could revolutionize the way we manage allergies and chronic disease. 06:52I believe that AI will make personal memory enhancement a reality. I can’t say when or what form factors are involved, but I think it’s inevitable, because the very things that make AI successful today – the availability of comprehensive data and the ability for machines to make sense of that data – can be applied to the data of our lives. And those data are here today, available for all of us, because we lead digitally mediated lives, in mobile and online. 07:31In my view, a personal memory is a private memory. We get to choose what is and is not recalled and retained.
It’s absolutely essential that this be kept very secure. 07:45Now for most of us, the impact of augmented personal memory will be a more improved mental gain, maybe, hopefully, a bit more social grace. But for the millions who suffer from Alzheimer’s and dementia, the difference that augmented memory could make is a difference between a life of isolation and a life of dignity and connection. 08:12We are in the middle of a renaissance in artificial intelligence right now. I mean, in just the past few years, we’re beginning to see solutions to AI problems that we have struggled with literally for decades: speech understanding, text understanding, image understanding. We have a choice in how we use this powerful technology. We can choose to use AI to automate and compete with us, or we can use AI to augment and collaborate with us, to overcome our cognitive limitations and to help us do what we want to do, only better. And as we discover new ways to give machines intelligence, we can distribute that intelligence to all of the AI assistants in the world, and therefore to every person, regardless of circumstance. And that is why, every time a machine gets smarter, we get smarter. 09:21That is an AI worth spreading. 09:24Thank you. 09:25(Applause)
We’re building a dystopia just to make people click on ads. Zeynep Tufekci, Oct 2017. Topic: AI.
We’re building an artificial intelligence-powered dystopia, one click at a time, says techno-sociologist Zeynep Tufekci. In an eye-opening talk, she details how the same algorithms companies like Facebook, Google and Amazon use to get you to click on ads are also used to organize your access to political and social information. And the machines aren’t even the real threat. What we need to understand is how the powerful might use AI to control us – and what we can do in response.
Transcript (23 languages):
00:00So when people voice fears of artificial intelligence, very often, they invoke images of humanoid robots run amok. You know? Terminator? You know, that might be something to consider, but that’s a distant threat. Or, we fret about digital surveillance with metaphors from the past. “1984,” George Orwell’s “1984,” it’s hitting the bestseller lists again. It’s a great book, but it’s not the correct dystopia for the 21st century. What we need to fear most is not what artificial intelligence will do to us on its own, but how the people in power will use artificial intelligence to control us and to manipulate us in novel, sometimes hidden, subtle and unexpected ways. Much of the technology that threatens our freedom and our dignity in the near-term future is being developed by companies in the business of capturing and selling our data and our attention to advertisers and others: Facebook, Google, Amazon, Alibaba, Tencent. 01:14Now, artificial intelligence has started bolstering their business as well. And it may seem like artificial intelligence is just the next thing after online ads. It’s not. It’s a jump in category. It’s a whole different world, and it has great potential. It could accelerate our understanding of many areas of study and research. But to paraphrase a famous Hollywood philosopher, “With prodigious potential comes prodigious risk.” 01:49Now let’s look at a basic fact of our digital lives, online ads. Right? We kind of dismiss them. They seem crude, ineffective. We’ve all had the experience of being followed on the web by an ad based on something we searched or read. You know, you look up a pair of boots and for a week, those boots are following you around everywhere you go. Even after you succumb and buy them, they’re still following you around. We’re kind of inured to that kind of basic, cheap manipulation. We roll our eyes and we think, “You know what? These things don’t work.” Except, online, the digital technologies are not just ads. Now, to understand that, let’s think of a physical world example. You know how, at the checkout counters at supermarkets, near the cashier, there’s candy and gum at the eye level of kids? That’s designed to make them whine at their parents just as the parents are about to sort of check out. Now, that’s a persuasion architecture. It’s not nice, but it kind of works. That’s why you see it in every supermarket. Now, in the physical world, such persuasion architectures are kind of limited, because you can only put so many things by the cashier. Right? And the candy and gum, it’s the same for everyone, even though it mostly works only for people who have whiny little humans beside them.
In the physical world, we live with those limitations. 03:22In the digital world, though, persuasion architectures can be built at the scale of billions and they can target, infer, understand and be deployed at individuals one by one by figuring out your weaknesses, and they can be sent to everyone’s phone private screen, so it’s not visible to us. And that’s different. And that’s just one of the basic things that artificial intelligence can do. 03:52Now, let’s take an example. Let’s say you want to sell plane tickets to Vegas. Right? So in the old world, you could think of some demographics to target based on experience and what you can guess. You might try to advertise to, oh, men between the ages of 25 and 35, or people who have a high limit on their credit card, or retired couples. Right? That’s what you would do in the past. 04:16With big data and machine learning, that’s not how it works anymore. So to imagine that, think of all the data that Facebook has on you: every status update you ever typed, every Messenger conversation, every place you logged in from, all your photographs that you uploaded there. If you start typing something and change your mind and delete it, Facebook keeps those and analyzes them, too. Increasingly, it tries to match you with your offline data. It also purchases a lot of data from data brokers. It could be everything from your financial records to a good chunk of your browsing history. Right? In the US, such data is routinely collected, collated and sold. In Europe, they have tougher rules. 05:11So what happens then is, by churning through all that data, these machine-learning algorithms – that’s why they’re called learning algorithms – they learn to understand the characteristics of people who purchased tickets to Vegas before. When they learn this from existing data, they also learn how to apply this to new people. So if they’re presented with a new person, they can classify whether that person is likely to buy a ticket to Vegas or not. Fine. You’re thinking, an offer to buy tickets to Vegas. I can ignore that. But the problem isn’t that. The problem is, we no longer really understand how these complex algorithms work. We don’t understand how they’re doing this categorization. It’s giant matrices, thousands of rows and columns, maybe millions of rows and columns, and not the programmers and not anybody who looks at it, even if you have all the data, understands anymore how exactly it’s operating any more than you’d know what I was thinking right now if you were shown a cross section of my brain. It’s like we’re not programming anymore, we’re growing intelligence that we don’t truly understand. 06:40And these things only work if there’s an enormous amount of data, so they also encourage deep surveillance on all of us so that the machine learning algorithms can work. That’s why Facebook wants to collect all the data it can about you. The algorithms work better. 06:56So let’s push that Vegas example a bit. What if the system that we do not understand was picking up that it’s easier to sell Vegas tickets to people who are bipolar and about to enter the manic phase. Such people tend to become overspenders, compulsive gamblers. They could do this, and you’d have no clue that’s what they were picking up on. I gave this example to a bunch of computer scientists once and afterwards, one of them came up to me. 
He was troubled and he said, “That’s why I couldn’t publish it.” I was like, “Couldn’t publish what?” He had tried to see whether you can indeed figure out the onset of mania from social media posts before clinical symptoms, and it had worked, and it had worked very well, and he had no idea how it worked or what it was picking up on. 07:54Now, the problem isn’t solved if he doesn’t publish it, because there are already companies that are developing this kind of technology, and a lot of the stuff is just off the shelf. This is not very difficult anymore. 08:09Do you ever go on YouTube meaning to watch one video and an hour later you’ve watched 27? You know how YouTube has this column on the right that says, “Up next” and it autoplays something? It’s an algorithm picking what it thinks that you might be interested in and maybe not find on your own. It’s not a human editor. It’s what algorithms do. It picks up on what you have watched and what people like you have watched, and infers that that must be what you’re interested in, what you want more of, and just shows you more. It sounds like a benign and useful feature, except when it isn’t. 08:49So in 2016, I attended rallies of then-candidate Donald Trump to study as a scholar the movement supporting him. I study social movements, so I was studying it, too. And then I wanted to write something about one of his rallies, so I watched it a few times on YouTube. YouTube started recommending to me and autoplaying to me white supremacist videos in increasing order of extremism. If I watched one, it served up one even more extreme and autoplayed that one, too. If you watch Hillary Clinton or Bernie Sanders content, YouTube recommends and autoplays conspiracy left, and it goes downhill from there. 09:40Well, you might be thinking, this is politics, but it’s not. This isn’t about politics. This is just the algorithm figuring out human behavior. I once watched a video about vegetarianism on YouTube and YouTube recommended and autoplayed a video about being vegan. It’s like you’re never hardcore enough for YouTube. 10:00(Laughter) 10:02So what’s going on? Now, YouTube’s algorithm is proprietary, but here’s what I think is going on. The algorithm has figured out that if you can entice people into thinking that you can show them something more hardcore, they’re more likely to stay on the site watching video after video going down that rabbit hole while Google serves them ads. Now, with nobody minding the ethics of the store, these sites can profile people who are Jew haters, who think that Jews are parasites and who have such explicit anti-Semitic content, and let you target them with ads. They can also mobilize algorithms to find for you look-alike audiences, people who do not have such explicit anti-Semitic content on their profile but who the algorithm detects may be susceptible to such messages, and lets you target them with ads, too. Now, this may sound like an implausible example, but this is real. ProPublica investigated this and found that you can indeed do this on Facebook, and Facebook helpfully offered up suggestions on how to broaden that audience. BuzzFeed tried it for Google, and very quickly they found, yep, you can do it on Google, too. And it wasn’t even expensive. The ProPublica reporter spent about 30 dollars to target this category. 11:50So last year, Donald Trump’s social media manager disclosed that they were using Facebook dark posts to demobilize people, not to persuade them, but to convince them not to vote at all. 
And to do that, they targeted specifically, for example, African-American men in key cities like Philadelphia, and I’m going to read exactly what he said. I’m quoting. 12:17They were using “nonpublic posts whose viewership the campaign controls so that only the people we want to see it see it. We modeled this. It will dramatically affect her ability to turn these people out.” 12:33What’s in those dark posts? We have no idea. Facebook won’t tell us. 12:40So Facebook also algorithmically arranges the posts that your friends put on Facebook, or the pages you follow. It doesn’t show you everything chronologically. It puts the order in the way that the algorithm thinks will entice you to stay on the site longer. 12:59Now, so this has a lot of consequences. You may be thinking somebody is snubbing you on Facebook. The algorithm may never be showing your post to them. The algorithm is prioritizing some of them and burying the others. 13:17Experiments show that what the algorithm picks to show you can affect your emotions. But that’s not all. It also affects political behavior. So in 2010, in the midterm elections, Facebook did an experiment on 61 million people in the US that was disclosed after the fact. So some people were shown, “Today is election day,” the simpler one, and some people were shown the one with that tiny tweak with those little thumbnails of your friends who clicked on “I voted.” This simple tweak. OK? So the pictures were the only change, and that post shown just once turned out an additional 340,000 voters in that election, according to this research as confirmed by the voter rolls. A fluke? No. Because in 2012, they repeated the same experiment. And that time, that civic message shown just once turned out an additional 270,000 voters. For reference, the 2016 US presidential election was decided by about 100,000 votes. Now, Facebook can also very easily infer what your politics are, even if you’ve never disclosed them on the site. Right? These algorithms can do that quite easily. What if a platform with that kind of power decides to turn out supporters of one candidate over the other? How would we even know about it? 15:13Now, we started from someplace seemingly innocuous – online ads following us around – and we’ve landed someplace else. As a public and as citizens, we no longer know if we’re seeing the same information or what anybody else is seeing, and without a common basis of information, little by little, public debate is becoming impossible, and we’re just at the beginning stages of this. These algorithms can quite easily infer things like your people’s ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age and genders, just from Facebook likes. These algorithms can identify protesters even if their faces are partially concealed. These algorithms may be able to detect people’s sexual orientation just from their dating profile pictures. 16:21Now, these are probabilistic guesses, so they’re not going to be 100 percent right, but I don’t see the powerful resisting the temptation to use these technologies just because there are some false positives, which will of course create a whole other layer of problems. Imagine what a state can do with the immense amount of data it has on its citizens. China is already using face detection technology to identify and arrest people.
And here’s the tragedy: we’re building this infrastructure of surveillance authoritarianism merely to get people to click on ads. And this won’t be Orwell’s authoritarianism. This isn’t “1984.” Now, if authoritarianism is using overt fear to terrorize us, we’ll all be scared, but we’ll know it, we’ll hate it and we’ll resist it. But if the people in power are using these algorithms to quietly watch us, to judge us and to nudge us, to predict and identify the troublemakers and the rebels, to deploy persuasion architectures at scale and to manipulate individuals one by one using their personal, individual weaknesses and vulnerabilities, and if they’re doing it at scale through our private screens so that we don’t even know what our fellow citizens and neighbors are seeing, that authoritarianism will envelop us like a spider’s web and we may not even know we’re in it. 18:10So Facebook’s market capitalization is approaching half a trillion dollars. It’s because it works great as a persuasion architecture. But the structure of that architecture is the same whether you’re selling shoes or whether you’re selling politics. The algorithms do not know the difference. The same algorithms set loose upon us to make us more pliable for ads are also organizing our political, personal and social information flows, and that’s what’s got to change. 18:50Now, don’t get me wrong, we use digital platforms because they provide us with great value. I use Facebook to keep in touch with friends and family around the world. I’ve written about how crucial social media is for social movements. I have studied how these technologies can be used to circumvent censorship around the world. But it’s not that the people who run, you know, Facebook or Google are maliciously and deliberately trying to make the country or the world more polarized and encourage extremism. I read the many well-intentioned statements that these people put out. But it’s not the intent or the statements people in technology make that matter, it’s the structures and business models they’re building. And that’s the core of the problem. Either Facebook is a giant con of half a trillion dollars and ads don’t work on the site, it doesn’t work as a persuasion architecture, or its power of influence is of great concern. It’s either one or the other. It’s similar for Google, too. 20:12So what can we do? This needs to change. Now, I can’t offer a simple recipe, because we need to restructure the whole way our digital technology operates. Everything from the way technology is developed to the way the incentives, economic and otherwise, are built into the system. We have to face and try to deal with the lack of transparency created by the proprietary algorithms, the structural challenge of machine learning’s opacity, all this indiscriminate data that’s being collected about us. We have a big task in front of us. We have to mobilize our technology, our creativity and yes, our politics so that we can build artificial intelligence that supports us in our human goals but that is also constrained by our human values. And I understand this won’t be easy. We might not even easily agree on what those terms mean. But if we take seriously how these systems that we depend on for so much operate, I don’t see how we can postpone this conversation anymore. These structures are organizing how we function and they’re controlling what we can and we cannot do. And many of these ad-financed platforms, they boast that they’re free. 
In this context, it means that we are the product that’s being sold. We need a digital economy where our data and our attention is not for sale to the highest-bidding authoritarian or demagogue. 22:11(Applause) 22:18So to go back to that Hollywood paraphrase, we do want the prodigious potential of artificial intelligence and digital technology to blossom, but for that, we must face this prodigious menace, open-eyed and now. 22:36Thank you. 22:37(Applause)
How AI can save our humanity. Kai-Fu Lee, Aug 2018. Topic: AI.
AI is massively transforming our world, but there’s one thing it cannot do: love. In a visionary talk, computer scientist Kai-Fu Lee details how the US and China are driving a deep learning revolution – and shares a blueprint for how humans can thrive in the age of AI by harnessing compassion and creativity. “AI is serendipity,” Lee says. “It is here to liberate us from routine jobs, and it is here to remind us what it is that makes us human.”
Transcript (23 languages):
00:00I’m going to talk about how AI and mankind can coexist, but first, we have to rethink about our human values. So let me first make a confession about my errors in my values. 00:13It was 11 o’clock, December 16, 1991. I was about to become a father for the first time. My wife, Shen-Ling, lay in the hospital bed going through a very difficult 12-hour labor. I sat by her bedside but looked anxiously at my watch, and I knew something that she didn’t. I knew that if in one hour, our child didn’t come, I was going to leave her there and go back to work and make a presentation about AI to my boss, Apple’s CEO. Fortunately, my daughter was born at 11:30 – 00:55(Laughter) 00:57(Applause) 00:59sparing me from doing the unthinkable, and to this day, I am so sorry for letting my work ethic take precedence over love for my family. 01:10(Applause) 01:16My AI talk, however, went off brilliantly. 01:18(Laughter) 01:21Apple loved my work and decided to announce it at TED1992, 26 years ago on this very stage. I thought I had made one of the biggest, most important discoveries in AI, and so did the “Wall Street Journal” on the following day. 01:39But as far as discoveries went, it turned out, I didn’t discover India, or America. Perhaps I discovered a little island off of Portugal. But the AI era of discovery continued, and more scientists poured their souls into it. About 10 years ago, the grand AI discovery was made by three North American scientists, and it’s known as deep learning. 02:05Deep learning is a technology that can take a huge amount of data within one single domain and learn to predict or decide at superhuman accuracy. For example, if we show the deep learning network a massive number of food photos, it can recognize food such as hot dog or no hot dog. 02:26(Applause) 02:29Or if we show it many pictures and videos and sensor data from driving on the highway, it can actually drive a car as well as a human being on the highway. And what if we showed this deep learning network all the speeches made by President Trump? Then this artificially intelligent President Trump, actually the network – 02:55(Laughter) 02:57can – 02:58(Applause) 03:02You like double oxymorons, huh? 03:05(Laughter) 03:09(Applause) 03:15So this network, if given the request to make a speech about AI, he, or it, might say – 03:24(Recording) Donald Trump: It’s a great thing to build a better world with artificial intelligence. 03:29Kai-Fu Lee: And maybe in another language? 03:31DT: (Speaking Chinese) 03:33(Laughter) 03:34KFL: You didn’t know he knew Chinese, did you? 03:38So deep learning has become the core in the era of AI discovery, and that’s led by the US.
But we’re now in the era of implementation, where what really matters is execution, product quality, speed and data. And that’s where China comes in. Chinese entrepreneurs, who I fund as a venture capitalist, are incredible workers, amazing work ethic. My example in the delivery room is nothing compared to how hard people work in China. As an example, one startup tried to claim work-life balance: “Come work for us because we are 996.” And what does that mean? It means the work hours of 9am to 9pm, six days a week. That’s contrasted with other startups that do 997. 04:27And the Chinese product quality has consistently gone up in the past decade, and that’s because of a fiercely competitive environment. In Silicon Valley, entrepreneurs compete in a very gentlemanly fashion, sort of like in old wars in which each side took turns to fire at each other. 04:47(Laughter) 04:48But in the Chinese environment, it’s truly a gladiatorial fight to the death. In such a brutal environment, entrepreneurs learn to grow very rapidly, they learn to make their products better at lightning speed, and they learn to hone their business models until they’re impregnable. As a result, great Chinese products like WeChat and Weibo are arguably better than the equivalent American products from Facebook and Twitter. 05:19footnotefootnoteAnd the Chinese market embraces this change and accelerated change and paradigm shifts. As an example, if any of you go to China, you will see it’s almost cashless and credit card-less, because that thing that we all talk about, mobile payment, has become the reality in China. In the last year, 18.8 trillion US dollars were transacted on mobile internet, and that’s because of very robust technologies built behind it. It’s even bigger than the China GDP. And this technology, you can say, how can it be bigger than the GDP? Because it includes all transactions: wholesale, channels, retail, online, offline, going into a shopping mall or going into a farmers market like this. The technology is used by 700 million people to pay each other, not just merchants, so it’s peer to peer, and it’s almost transaction-fee-free. And it’s instantaneous, and it’s used everywhere. And finally, the China market is enormous. This market is large, which helps give entrepreneurs more users, more revenue, more investment, but most importantly, it gives the entrepreneurs a chance to collect a huge amount of data which becomes rocket fuel for the AI engine. So as a result, the Chinese AI companies have leaped ahead so that today, the most valuable companies in computer vision, speech recognition, speech synthesis, machine translation and drones are all Chinese companies. 06:59footnotefootnoteSo with the US leading the era of discovery and China leading the era of implementation, we are now in an amazing age where the dual engine of the two superpowers are working together to drive the fastest revolution in technology that we have ever seen as humans. And this will bring tremendous wealth, unprecedented wealth: 16 trillion dollars, according to PwC, in terms of added GDP to the worldwide GDP by 2030. It will also bring immense challenges in terms of potential job replacements. Whereas in the Industrial Age it created more jobs because craftsman jobs were being decomposed into jobs in the assembly line, so more jobs were created. But AI completely replaces the individual jobs in the assembly line with robots. 
And it’s not just in factories, but truckers, drivers and even jobs like telesales, customer service and hematologists as well as radiologists over the next 15 years are going to be gradually replaced by artificial intelligence. And only the creative jobs – 08:20(Laughter) 08:22I have to make myself safe, right? Really, the creative jobs are the ones that are protected, because AI can optimize but not create. 08:33But what’s more serious than the loss of jobs is the loss of meaning, because the work ethic in the Industrial Age has brainwashed us into thinking that work is the reason we exist, that work defined the meaning of our lives. And I was a prime and willing victim to that type of workaholic thinking. I worked incredibly hard. That’s why I almost left my wife in the delivery room, that’s why I worked 996 alongside my entrepreneurs. And that obsession that I had with work ended abruptly a few years ago when I was diagnosed with fourth stage lymphoma. The PET scan here shows over 20 malignant tumors jumping out like fireballs, melting away my ambition. But more importantly, it helped me reexamine my life. Knowing that I may only have a few months to live caused me to see how foolish it was for me to base my entire self-worth on how hard I worked and the accomplishments from hard work. My priorities were completely out of order. I neglected my family. My father had passed away, and I never had a chance to tell him I loved him. My mother had dementia and no longer recognized me, and my children had grown up. 10:04During my chemotherapy, I read a book by Bronnie Ware who talked about dying wishes and regrets of the people in the deathbed. She found that facing death, nobody regretted that they didn’t work hard enough in this life. They only regretted that they didn’t spend enough time with their loved ones and that they didn’t spread their love. 10:30So I am fortunately today in remission. 10:34(Applause) 10:41So I can be back at TED again to share with you that I have changed my ways. I now only work 965 – occasionally 996, but usually 965. I moved closer to my mother, my wife usually travels with me, and when my kids have vacation, if they don’t come home, I go to them. So it’s a new form of life that helped me recognize how important it is that love is for me, and facing death helped me change my life, but it also helped me see a new way of how AI should impact mankind and work and coexist with mankind, that really, AI is taking away a lot of routine jobs, but routine jobs are not what we’re about. 11:32Why we exist is love. When we hold our newborn baby, love at first sight, or when we help someone in need, humans are uniquely able to give and receive love, and that’s what differentiates us from AI. 11:48Despite what science fiction may portray, I can responsibly tell you that AI has no love. When AlphaGo defeated the world champion Ke Jie, while Ke Jie was crying and loving the game of go, AlphaGo felt no happiness from winning and certainly no desire to hug a loved one. 12:11So how do we differentiate ourselves as humans in the age of AI? We talked about the axis of creativity, and certainly that is one possibility, and now we introduce a new axis that we can call compassion, love, or empathy. Those are things that AI cannot do. So as AI takes away the routine jobs, I like to think we can, we should and we must create jobs of compassion. 
You might ask how many of those there are, but I would ask you: Do you not think that we are going to need a lot of social workers to help us make this transition? Do you not think we need a lot of compassionate caregivers to give more medical care to more people? Do you not think we’re going to need 10 times more teachers to help our children find their way to survive and thrive in this brave new world? And with all the newfound wealth, should we not also make labors of love into careers and let elderly accompaniment or homeschooling become careers also? 13:18(Applause) 13:24This graph is surely not perfect, but it points at four ways that we can work with AI. AI will come and take away the routine jobs and in due time, we will be thankful. AI will become great tools for the creatives so that scientists, artists, musicians and writers can be even more creative. AI will work with humans as analytical tools that humans can wrap their warmth around for the high-compassion jobs. And we can always differentiate ourselves with the uniquely capable jobs that are both compassionate and creative, using and leveraging our irreplaceable brains and hearts. So there you have it: a blueprint of coexistence for humans and AI. 14:15AI is serendipity. It is here to liberate us from routine jobs, and it is here to remind us what it is that makes us human. So let us choose to embrace AI and to love one another. 14:28Thank you. 14:29(Applause)
1 Why fascism is so tempting — and how your data could power it Yuval Noah Harari May 2018 AI 1 Yuval Noah Harari May 2018 In a profound talk about technology and power, author and historian Yuval Noah Harari explains the important difference between fascism and nationalism – and what the consolidation of our data means for the future of democracy. Appearing as a hologram live from Tel Aviv, Harari warns that the greatest danger that now faces liberal democracy is that the revolution in information technology will make dictatorships more efficient and capable of control. “The enemies of liberal democ…  (139K) Transcript (31 Languages)DeutschEnglishEspañolFrançaisHrvatskiItalianoLatviešuMagyarNederlandsPolskiPortuguês brasileiroPortuguês de PortugalRomânăSuomiTürkçeΕλληνικάРусскийСрпски, SrpskiУкраїнськабългарскиעבריתالعربيةفارسىکوردی سۆرانیکورمانجیภาษาไทยမြန်မာဘာသာ中文 (简体)中文 (繁體)日本語한국어00:00Hello, everyone. It’s a bit funny, because I did write that humans will become digital, but I didn’t think it will happen so fast and that it will happen to me. But here I am, as a digital avatar, and here you are, so let’s start. And let’s start with a question. How many fascists are there in the audience today? 00:26(Laughter) 00:27Well, it’s a bit difficult to say, because we’ve forgotten what fascism is. People now use the term “fascist” as a kind of general-purpose abuse. Or they confuse fascism with nationalism. So let’s take a few minutes to clarify what fascism actually is, and how it is different from nationalism. 00:53The milder forms of nationalism have been among the most benevolent of human creations. Nations are communities of millions of strangers who don’t really know each other. For example, I don’t know the eight million people who share my Israeli citizenship. But thanks to nationalism, we can all care about one another and cooperate effectively. This is very good. Some people, like John Lennon, imagine that without nationalism, the world will be a peaceful paradise. But far more likely, without nationalism, we would have been living in tribal chaos. If you look today at the most prosperous and peaceful countries in the world, countries like Sweden and Switzerland and Japan, you will see that they have a very strong sense of nationalism. In contrast, countries that lack a strong sense of nationalism, like Congo and Somalia and Afghanistan, tend to be violent and poor. 02:04So what is fascism, and how is it different from nationalism? Well, nationalism tells me that my nation is unique, and that I have special obligations towards my nation. Fascism, in contrast, tells me that my nation is supreme, and that I have exclusive obligations towards it. I don’t need to care about anybody or anything other than my nation. Usually, of course, people have many identities and loyalties to different groups. For example, I can be a good patriot, loyal to my country, and at the same time, be loyal to my family, my neighborhood, my profession, humankind as a whole, truth and beauty. Of course, when I have different identities and loyalties, it sometimes creates conflicts and complications. But, well, who ever told you that life was easy? Life is complicated. Deal with it. 03:14Fascism is what happens when people try to ignore the complications and to make life too easy for themselves. Fascism denies all identities except the national identity and insists that I have obligations only towards my nation. If my nation demands that I sacrifice my family, then I will sacrifice my family. 
If the nation demands that I kill millions of people, then I will kill millions of people. And if my nation demands that I betray truth and beauty, then I should betray truth and beauty. For example, how does a fascist evaluate art? How does a fascist decide whether a movie is a good movie or a bad movie? Well, it’s very, very, very simple. There is really just one yardstick: if the movie serves the interests of the nation, it’s a good movie; if the movie doesn’t serve the interests of the nation, it’s a bad movie. That’s it. Similarly, how does a fascist decide what to teach kids in school? Again, it’s very simple. There is just one yardstick: you teach the kids whatever serves the interests of the nation. The truth doesn’t matter at all. 04:48Now, the horrors of the Second World War and of the Holocaust remind us of the terrible consequences of this way of thinking. But usually, when we talk about the ills of fascism, we do so in an ineffective way, because we tend to depict fascism as a hideous monster, without really explaining what was so seductive about it. It’s a bit like these Hollywood movies that depict the bad guys – Voldemort or Sauron or Darth Vader – as ugly and mean and cruel. They’re cruel even to their own supporters. When I see these movies, I never understand – why would anybody be tempted to follow a disgusting creep like Voldemort? The problem with evil is that in real life, evil doesn’t necessarily look ugly. It can look very beautiful. This is something that Christianity knew very well, which is why in Christian art, as [opposed to] Hollywood, Satan is usually depicted as a gorgeous hunk. This is why it’s so difficult to resist the temptations of Satan, and why it is also difficult to resist the temptations of fascism. 06:10Fascism makes people see themselves as belonging to the most beautiful and most important thing in the world – the nation. And then people think, “Well, they taught us that fascism is ugly. But when I look in the mirror, I see something very beautiful, so I can’t be a fascist, right?” Wrong. That’s the problem with fascism. When you look in the fascist mirror, you see yourself as far more beautiful than you really are. In the 1930s, when Germans looked in the fascist mirror, they saw Germany as the most beautiful thing in the world. If today, Russians look in the fascist mirror, they will see Russia as the most beautiful thing in the world. And if Israelis look in the fascist mirror, they will see Israel as the most beautiful thing in the world. This does not mean that we are now facing a rerun of the 1930s. 07:12Fascism and dictatorships might come back, but they will come back in a new form, a form which is much more relevant to the new technological realities of the 21st century. In ancient times, land was the most important asset in the world. Politics, therefore, was the struggle to control land. And dictatorship meant that all the land was owned by a single ruler or by a small oligarch. And in the modern age, machines became more important than land. Politics became the struggle to control the machines. And dictatorship meant that too many of the machines became concentrated in the hands of the government or of a small elite. Now data is replacing both land and machines as the most important asset. Politics becomes the struggle to control the flows of data. And dictatorship now means that too much data is being concentrated in the hands of the government or of a small elite. 
08:28The greatest danger that now faces liberal democracy is that the revolution in information technology will make dictatorships more efficient than democracies. 08:42In the 20th century, democracy and capitalism defeated fascism and communism because democracy was better at processing data and making decisions. Given 20th-century technology, it was simply inefficient to try and concentrate too much data and too much power in one place. 09:07But it is not a law of nature that centralized data processing is always less efficient than distributed data processing. With the rise of artificial intelligence and machine learning, it might become feasible to process enormous amounts of information very efficiently in one place, to take all the decisions in one place, and then centralized data processing will be more efficient than distributed data processing. And then the main handicap of authoritarian regimes in the 20th century – their attempt to concentrate all the information in one place – it will become their greatest advantage. 09:58Another technological danger that threatens the future of democracy is the merger of information technology with biotechnology, which might result in the creation of algorithms that know me better than I know myself. And once you have such algorithms, an external system, like the government, cannot just predict my decisions, it can also manipulate my feelings, my emotions. A dictator may not be able to provide me with good health care, but he will be able to make me love him and to make me hate the opposition. Democracy will find it difficult to survive such a development because, in the end, democracy is not based on human rationality; it’s based on human feelings. During elections and referendums, you’re not being asked, “What do you think?” You’re actually being asked, “How do you feel?” And if somebody can manipulate your emotions effectively, democracy will become an emotional puppet show. 11:18So what can we do to prevent the return of fascism and the rise of new dictatorships? The number one question that we face is: Who controls the data? If you are an engineer, then find ways to prevent too much data from being concentrated in too few hands. And find ways to make sure the distributed data processing is at least as efficient as centralized data processing. This will be the best safeguard for democracy. As for the rest of us who are not engineers, the number one question facing us is how not to allow ourselves to be manipulated by those who control the data. 12:11The enemies of liberal democracy, they have a method. They hack our feelings. Not our emails, not our bank accounts – they hack our feelings of fear and hate and vanity, and then use these feelings to polarize and destroy democracy from within. This is actually a method that Silicon Valley pioneered in order to sell us products. But now, the enemies of democracy are using this very method to sell us fear and hate and vanity. They cannot create these feelings out of nothing. So they get to know our own preexisting weaknesses. And then use them against us. And it is therefore the responsibility of all of us to get to know our weaknesses and make sure that they do not become a weapon in the hands of the enemies of democracy. 13:15Getting to know our own weaknesses will also help us to avoid the trap of the fascist mirror. As we explained earlier, fascism exploits our vanity. It makes us see ourselves as far more beautiful than we really are. This is the seduction. 
But if you really know yourself, you will not fall for this kind of flattery. If somebody puts a mirror in front of your eyes that hides all your ugly bits and makes you see yourself as far more beautiful and far more important than you really are, just break that mirror. 14:01Thank you. 14:02(Applause) 14:10Chris Anderson: Yuval, thank you. Goodness me. It’s so nice to see you again. So, if I understand you right, you’re alerting us to two big dangers here. One is the possible resurgence of a seductive form of fascism, but close to that, dictatorships that may not exactly be fascistic, but control all the data. I wonder if there’s a third concern that some people here have already expressed, which is where, not governments, but big corporations control all our data. What do you call that, and how worried should we be about that? 14:44Yuval Noah Harari: Well, in the end, there isn’t such a big difference between the corporations and the governments, because, as I said, the questions is: Who controls the data? This is the real government. If you call it a corporation or a government – if it’s a corporation and it really controls the data, this is our real government. So the difference is more apparent than real. 15:06CA: But somehow, at least with corporations, you can imagine market mechanisms where they can be taken down. I mean, if consumers just decide that the company is no longer operating in their interest, it does open the door to another market. It seems easier to imagine that than, say, citizens rising up and taking down a government that is in control of everything. 15:25YNH: Well, we are not there yet, but again, if a corporation really knows you better than you know yourself – at least that it can manipulate your own deepest emotions and desires, and you won’t even realize – you will think this is your authentic self. So in theory, yes, in theory, you can rise against a corporation, just as, in theory, you can rise against a dictatorship. But in practice, it is extremely difficult. 15:55CA: So in “Homo Deus,” you argue that this would be the century when humans kind of became gods, either through development of artificial intelligence or through genetic engineering. Has this prospect of political system shift, collapse impacted your view on that possibility? 16:17YNH: Well, I think it makes it even more likely, and more likely that it will happen faster, because in times of crisis, people are willing to take risks that they wouldn’t otherwise take. And people are willing to try all kinds of high-risk, high-gain technologies. So these kinds of crises might serve the same function as the two world wars in the 20th century. The two world wars greatly accelerated the development of new and dangerous technologies. And the same thing might happen in the 21st century. I mean, you need to be a little crazy to run too fast, let’s say, with genetic engineering. But now you have more and more crazy people in charge of different countries in the world, so the chances are getting higher, not lower. 17:11CA: So, putting it all together, Yuval, you’ve got this unique vision. Roll the clock forward 30 years. What’s your guess – does humanity just somehow scrape through, look back and say, “Wow, that was a close thing. We did it!” Or not? 17:24YNH: So far, we’ve managed to overcome all the previous crises. And especially if you look at liberal democracy and you think things are bad now, just remember how much worse things looked in 1938 or in 1968. 
So this is really nothing, this is just a small crisis. But you can never know, because, as a historian, I know that you should never underestimate human stupidity. 17:53(Laughter) (Applause) 17:54It is one of the most powerful forces that shape history. 17:59CA: Yuval, it’s been an absolute delight to have you with us. Thank you for making the virtual trip. Have a great evening there in Tel Aviv. Yuval Harari! 18:07YNH: Thank you very much. 18:08(Applause)
kable(views_add[1:10,], caption = "The example of original add_details_1 table") %>%
   kable_paper() %>%
   kableExtra::scroll_box(width = "100%", height = "200px")
The example of original add_details_1 table
page_title views_details
How does artificial intelligence learn? 513,440 views | Briana Brownell • TED-Ed
The danger of AI is weirder than you think 3,170,321 views | Janelle Shane • TED2019
The wonderful and terrifying implications of computers that can learn 2,693,800 views | Jeremy Howard • TEDxBrussels
How do we find dignity at work? 2,204,611 views | Roy Bahat and Bryn Freedman • TED Salon: Zebra Technologies
The incredible inventions of intuitive AI 7,349,317 views | Maurice Conti • TEDxPortland
How AI can bring on a second Industrial Revolution 1,864,143 views | Kevin Kelly • TEDSummit
How AI can enhance our memory, work and social lives 2,166,189 views | Tom Gruber • TED2017
We’re building a dystopia just to make people click on ads 3,353,192 views | Zeynep Tufekci • TEDGlobal>NYC
How AI can save our humanity 4,115,567 views | Kai-Fu Lee • TED2018
Why fascism is so tempting — and how your data could power it 4,641,498 views | Yuval Noah Harari • TED2018


We then remove duplicated observations and combine the two tables by matching the “title” column. The resulting table, which we name TED, contains 324 observations and 12 variables. For our analysis, however, we are only interested in six variables: the title of the video (title), the posting time (posted), the topic (cate), the number of likes (likes), the transcript (transcript), and the number of views (views_details). We therefore select these variables and remove the rest, creating a new table called TED_sentiment that serves as the main table for the sentiment analysis. We also remove the title variable from the TED table, as it is not needed in the analyses that follow.

# Delete duplicate rows of TED
#sum(duplicated(TED)) #6
TED <- TED[!duplicated(TED), ]

# Delete duplicate rows of views_add
#unique(views_add$page_title) #304 so duplicate title = 6
views_add <- views_add[!duplicated(views_add), ] 

# Combine tables
TED <- left_join(TED,views_add, by = c("title"="page_title")) 
TED <- TED %>% as_tibble() %>%
  select(title, views_times.x, cate, likes, tanscript, views_details)
colnames(TED)[2] <- "posted"

# Identify NA and remove
# checkna <- TED[is.na(TED$tanscript), ]
# numtotal <- data.frame(table(TED$cate))
# numna <- data.frame(table(checkna$cate))
# diff <- numtotal %>% left_join(numna, by = "Var1") %>%
#   mutate(diff = Freq.x-Freq.y)
TED <- na.omit(TED) #286
catenum <- data.frame(table(TED$cate))
colnames(catenum) <- c("Topics", "Count")

kbl(catenum, caption = "The number of videos per topic") %>%
   kable_paper(bootstrap_options = "striped", full_width = F, position = "float_left") 
The number of videos per topic
Topics Count
AI 103
Climate change 86
Relationships 97

After performing these operations, we discover that the TED table contains 34 missing values (NAs), which we remove. The resulting TED table contains 286 observations: 103 videos on AI, 86 on Climate change, and 97 on Relationships. This table serves as the basis for our further analyses.

Next, we parse the data: we convert the posting time, the number of likes, and the number of views into formats appropriate for further analyses. For example, the posting time of the first video, “How does artificial intelligence learn?”, is Mar 2021 in the original TED table; it is converted to 2021-03-01.

Additionally, the transcript of every video begins with the number of translated languages and the details of the available translations. Since we are interested only in the actual spoken transcript, we remove this header. For example, the transcript of the first video, “How does artificial intelligence learn?”, begins with “Transcript (28 Languages)Bahasa IndonesiaDeutschEnglishEspañolFrançaisItalianoMagyarPolskiPortuguês brasileiroPortuguês de PortugalRomânăTiếng ViệtTürkçeΕλληνικάРусскийСрпски, Srpskiעברית‎العربية‎فارسى‎کوردی سۆرانی‎বাংলাதமிழ்ภาษาไทยမြန်မာဘာသာ中文 (简体)中文 (繁體)日本語한국어”.

# Parse data
TED$posted <- my(TED$posted)   # lubridate: "Mar 2021" -> 2021-03-01
TED$cate[which(TED$cate=="AI")] <- "1"
TED$cate[which(TED$cate=="Climate change")] <- "2"
TED$cate[which(TED$cate=="Relationships")] <- "3"
# Clean number of likes: drop the parentheses and the leading character
# shared by all entries, then expand the K/M suffixes via scientific notation
TED$likes <- gsub("[(]",'',TED$likes)
TED$likes <- gsub("[)]",'',TED$likes)
x <- substr(TED$likes[1],1,1)
TED$likes <- gsub(x,'',TED$likes)
TED$likes <- gsub("K", "e3", TED$likes)  # e.g. "123K" -> "123e3"
TED$likes <- gsub("M", "e6", TED$likes)
TED$likes <- as.numeric(TED$likes)

# Clean number of views
# first separate the views detail into two parts (before "views" after "views")
views_time <- as.character()
for (i in 1:length(TED$views_details)) {
  views_temp <- TED$views_details[i]
  views_temp <- strsplit(views_temp, "views")
  views_temp <- views_temp[[1]][1]
  views_time <- append(views_time,views_temp)
}
views_time <- gsub(" ","",views_time)
views_time <- gsub(",","",views_time)
TED$views_details <- as.numeric(views_time)

# Clean transcript: drop everything before the first "00:" timestamp,
# then flatten line breaks and strip the remaining digits (timestamps)
TED$tanscript <- gsub("^.+?00:(.*)","\\1",TED$tanscript)
TED$tanscript <- gsub("\r\n"," ",TED$tanscript)
TED$tanscript <- gsub("[[:digit:]]"," ",TED$tanscript)

# extract on version of TED for sentiment part
TED_sentiment <- TED
# no need for title column in the following analysis
TED <- TED %>% select(-title)

In order to facilitate data analysis, we assign numerical values to the categories of AI, Climate Change, and Relationships in the TED dataset. The variable cate is used to represent these categories, with AI being represented by number 1, Climate Change by number 2, and Relationships by number 3. This allows for easier tracking of the videos in both supervised and unsupervised learning analyses.

Due to the limited number of videos available within the selected categories on the TED website, we cannot gather a larger dataset for the unsupervised and supervised learning analyses. To increase the number of observations while avoiding overfitting and striving for a robust model, we decide to treat each window of 20 sentences as one observation, based on the observation that the transcript of each video typically contains more than 20 sentences.

We split the transcripts into sentences using the tokenize_sentence() function from the quanteda package and create a new variable, sub_cate. For example, a sub_cate of 1.1 indicates that the observation comes from the first transcript in the AI category (the first topic). We then create a text variable to uniquely identify each text, with the format X.Y.Z indicating the Zth segment of 20 sentences in the Yth transcript of the Xth category. With this approach, the number of observations increases from 286 to 1,471; we name the resulting data frame TED_full. In sum, TED_full consists of 1,471 observations and 7 variables: posted, cate, like, view, subcate, text, and transcript.

# Increase the number of instances: 20 sentences = 1 instance
TED_full <- TED[0,]
TED_full$subcate <- TED_full$cate #new col but same type
TED_full$text <- TED_full$cate
n_transcript <- length(TED$tanscript)

sub_cate_1 = 0
sub_cate_2 = 0
sub_cate_3 = 0

for (i in 1:n_transcript) {
  
  if (TED$cate[i] == "1") {
    sub_cate_1 <- sub_cate_1 + 1
    subcat_temp =  paste(TED$cate[i],".",as.character(sub_cate_1), sep = "", collapse = "")
  } 
  else if (TED$cate[i] == "2") {
    sub_cate_2 <- sub_cate_2 + 1
    subcat_temp =  paste(TED$cate[i],".",as.character(sub_cate_2), sep = "", collapse = "")
  }
  else {
    sub_cate_3 <- sub_cate_3 + 1
    subcat_temp =  paste(TED$cate[i],".",as.character(sub_cate_3), sep = "", collapse = "")
  }
   
  transcript_i <- TED$tanscript[i]
  transcript_i_sentence <- unlist(tokenize_sentence(transcript_i))
  n_sen <- length(transcript_i_sentence)
  n_group <- ceiling(n_sen/20) # number of 20-sentence segments for this video
  for (j in 1:n_group) {
    if (j == n_group) {
      sentence_temp <- paste(transcript_i_sentence[((j-1)*20+1):(n_sen)], collapse = " ")
    } 
    else {
      sentence_temp <- paste(transcript_i_sentence[((j-1)*20+1):(j*20)], collapse = " ")
    }
    
    text_temp = paste(subcat_temp,".",as.character(j), sep = "", collapse = "")
    TED_temp <- data.frame(posted = TED$posted[i], cate = TED$cate[i], like = TED$likes[i], view = TED$views_details[i], subcate = subcat_temp, text = text_temp, tanscript = sentence_temp)
    TED_full <- rbind(TED_full, TED_temp)
  }
    
}

TED_full$tanscript <- trimws(TED_full$tanscript)

# Our final table = TED_full consisting of 1471 instances

kable(TED_full[10,], caption = "The example of TED_full table") %>%
  kable_paper() %>%
  kableExtra::scroll_box(width = "100%", height = "200px")
The example of TED_full table
posted cate like view subcate text tanscript
10 2014-12-01 1 80000 2693800 1.3 1.3.4 In fact, deep learning has done more than that. Complex, nuanced sentences like this one are now understandable with deep learning algorithms. As you can see here, this Stanford-based system showing the red dot at the top has figured out that this sentence is expressing negative sentiment. Deep learning now in fact is near human performance at understanding what sentences are about and what it is saying about those things. Also, deep learning has been used to read Chinese, again at about native Chinese speaker level. This algorithm developed out of Switzerland by people, none of whom speak or understand any Chinese. As I say, using deep learning is about the best system in the world for this, even compared to native human understanding. : This is a system that we put together at my company which shows putting all this stuff together. These are pictures which have no text attached, and as I’m typing in here sentences, in real time it’s understanding these pictures and figuring out what they’re about and finding pictures that are similar to the text that I’m writing. So you can see, it’s actually understanding my sentences and actually understanding these pictures. I know that you’ve seen something like this on Google, where you can type in things and it will show you pictures, but actually what it’s doing is it’s searching the webpage for the text. This is very different from actually understanding the images. This is something that computers have only been able to do for the first time in the last few months. : footnotefootnoteSo we can see now that computers can not only see but they can also read, and, of course, we’ve shown that they can understand what they hear. Perhaps not surprising now that I’m going to tell you they can write. Here is some text that I generated using a deep learning algorithm yesterday. And here is some text that an algorithm out of Stanford generated. Each of these sentences was generated by a deep learning algorithm to describe each of those pictures. This algorithm before has never seen a man in a black shirt playing a guitar. It’s seen a man before, it’s seen black before, it’s seen a guitar before, but it has independently generated this novel description of this picture. We’re still not quite at human performance here, but we’re close. In tests, humans prefer the computer-generated caption one out of four times.

3.2 Tokenization

We tokenize the transcripts with the quanteda package in order to obtain the Document-Term Matrix and the TFIDF matrix. In this section, we perform the tokenization twice. First, we tokenize TED, which consists of 286 videos/observations, to gain insight into each video and to observe the similarities and dissimilarities between videos. Second, we tokenize TED_full, which consists of 1,471 instances, for the unsupervised and supervised learning analyses.

3.2.1 Tokenization from TED

We apply the corpus() and tokens() functions to the transcript variable to remove numbers, punctuation, symbols, and separators. We then remove the English stop words from the SMART information retrieval system (571 words) and delete 2 more words, applaud and laughter, which appear frequently in our transcripts as sound representations. Sound representation in a transcript is one of TED's accessibility features, meant to enable deaf and hard-of-hearing viewers to access the non-spoken auditory information. Afterward, we perform lemmatization and name the resulting tokens object TED.tk1.

To obtain the Document-Term Matrix (DTM) and the TFIDF matrix, we use dfm() and dfm_tfidf() functions, respectively. The first 10 terms and 10 documents (videos) are shown below.
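For reference, with the default settings of dfm_tfidf() (raw counts for the term frequency, base-10 inverse document frequency), the weight of term \(t\) in document \(d\) is \(\mathrm{tfidf}_{d,t} = \mathrm{tf}_{d,t} \times \log_{10}(N/\mathrm{df}_t)\), where \(N\) is the number of documents and \(\mathrm{df}_t\) is the number of documents containing \(t\). A term that appears in every document therefore receives a weight of zero everywhere.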

# Quanteda
TED.cp1 <- corpus(TED$tanscript)
#summary(TED.cp1)

TED.tk1 <- tokens(
  TED.cp1, 
  remove_numbers = TRUE, 
  remove_punct = TRUE, 
  remove_symbols = TRUE, 
  remove_separators = TRUE)

TED.tk1 <- TED.tk1 %>% 
  tokens_tolower() %>% 
  tokens_remove(c(stopwords(source = "smart"), "applaud", "laughter"))

TED.tk1 <- tokens_replace(
  TED.tk1,
  pattern = hash_lemmas$token, 
  replacement = hash_lemmas$lemma)

TED.dfm1 <- dfm(TED.tk1)
kable(TED.dfm1[1:10,1:10], caption = "The example of Document-Term Matrix") %>%
  kable_paper() %>%
  kableExtra::scroll_box(width = "100%", height = "200px")
The example of Document-Term Matrix
doc_id today artificial intelligence help doctor diagnose patient pilot fly commercial
text1 1 3 2 1 6 4 11 1 1 1
text2 0 2 2 0 0 0 0 0 0 0
text3 1 0 0 0 2 0 0 0 0 1
text4 0 1 1 0 0 0 0 0 0 0
text5 1 0 0 0 0 0 0 0 2 0
text6 3 12 10 0 2 1 0 1 2 0
text7 3 3 7 4 1 1 0 0 0 0
text8 1 8 10 0 0 0 0 0 0 0
text9 2 2 2 5 0 1 0 0 0 0
text10 3 2 2 0 0 0 0 0 0 0
TED.tfidf1 <- dfm_tfidf(TED.dfm1)  
kable(TED.tfidf1[1:10,1:10], caption = "The example of TFIDF matrix") %>%
  kable_paper() %>%
  kableExtra::scroll_box(width = "100%", height = "200px")
The example of TFIDF matrix
doc_id today artificial intelligence help doctor diagnose patient pilot fly commercial
text1 0.2716746 1.8525508 1.1980671 0.4520447 5.0614931 4.378553 10.61505 1.225917 0.8129134 1.280275
text2 0.0000000 1.2350339 1.1980671 0.0000000 0.0000000 0.000000 0.00000 0.000000 0.0000000 0.000000
text3 0.2716746 0.0000000 0.0000000 0.0000000 1.6871644 0.000000 0.00000 0.000000 0.0000000 1.280275
text4 0.0000000 0.6175169 0.5990335 0.0000000 0.0000000 0.000000 0.00000 0.000000 0.0000000 0.000000
text5 0.2716746 0.0000000 0.0000000 0.0000000 0.0000000 0.000000 0.00000 0.000000 1.6258267 0.000000
text6 0.8150238 7.4102033 5.9903354 0.0000000 1.6871644 1.094638 0.00000 1.225917 1.6258267 0.000000
text7 0.8150238 1.8525508 4.1932348 1.8081786 0.8435822 1.094638 0.00000 0.000000 0.0000000 0.000000
text8 0.2716746 4.9401355 5.9903354 0.0000000 0.0000000 0.000000 0.00000 0.000000 0.0000000 0.000000
text9 0.5433492 1.2350339 1.1980671 2.2602233 0.0000000 1.094638 0.00000 0.000000 0.0000000 0.000000
text10 0.8150238 1.2350339 1.1980671 0.0000000 0.0000000 0.000000 0.00000 0.000000 0.0000000 0.000000


Additionally, the term frequencies can be obtained with textstat_frequency(). The terms are ranked by frequency (rank = 1 for the most frequent) and plotted against their rank, as shown below, to illustrate Zipf's law. According to Zipf's law, the frequency of a word is inversely proportional to its rank, and we observe that the relationship between frequency and rank broadly follows this pattern.

We then present the scatter plot on a log10-log10 scale. Although we expect an approximately linear relationship in this plot, we observe some deviation from Zipf's law on the right-hand side of the chart (high ranks / low-frequency words). A possible reason for the deviation is that the text we analyze is not a representative sample of the language; for example, it contains many technical and topic-specific terms.

TED.freq1 <- textstat_frequency(TED.dfm1)
#head(TED.freq1, 10)

zipf_orig<- ggplot(TED.freq1,
       aes(x = rank, y = frequency, label = feature)) + 
  geom_point() + 
  geom_text_repel() +
  ggtitle("The relationship of frequency and rank")

zipf_log <- ggplot(TED.freq1,
       aes(x = rank, y = frequency, label = feature)) + 
  geom_point() + 
  geom_text_repel() +
  scale_x_log10() +
  scale_y_log10() +
  ggtitle("The relationship of frequency and rank on log10-log10 scale") 

(zipf_orig+zipf_log)+
  plot_layout(guides = "collect") 
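As a rough quantitative check, one can also fit a linear regression on the log-log scale. This is a minimal sketch using the TED.freq1 table computed above; under Zipf's law the fitted slope should be close to -1.

# Estimate the Zipf exponent: under Zipf's law, frequency ~ C / rank^s,
# so log10(frequency) is linear in log10(rank) with slope -s (about -1)
zipf_fit <- lm(log10(frequency) ~ log10(rank), data = TED.freq1)
coef(zipf_fit)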

3.2.2 Tokenization from TED_full for unsupervised and supervised learning analyses

In this section, we repeat the same tokenization steps as in the previous section, this time on TED_full, and store the resulting tokens object as TED.tk. We again present the first 10 terms and 10 documents of the Document-Term Matrix and the TFIDF matrix below.

# Quanteda
TED.cp <- corpus(TED_full$tanscript)
#summary(TED.cp)

TED.tk <- tokens(
  TED.cp, 
  remove_numbers = TRUE, 
  remove_punct = TRUE, 
  remove_symbols = TRUE, 
  remove_separators = TRUE)

TED.tk <- TED.tk %>% 
  tokens_tolower() %>% 
  tokens_remove(c(stopwords(source = "smart"), "applaud", "laughter"))

TED.tk <- tokens_replace(
  TED.tk,
  pattern = hash_lemmas$token, 
  replacement = hash_lemmas$lemma)

TED.dfm <- dfm(TED.tk)
kable(TED.dfm[1:10,1:10], caption = "The example of Document-Term Matrix") %>%
  kable_paper() %>%
  kableExtra::scroll_box(width = "100%", height = "200px")
The example of Document-Term Matrix
doc_id today artificial intelligence help doctor diagnose patient pilot fly commercial
text1 1 2 2 1 6 4 8 1 1 1
text2 0 1 0 0 0 0 3 0 0 0
text3 0 2 2 0 0 0 0 0 0 0
text4 0 0 0 0 0 0 0 0 0 0
text5 0 0 0 0 0 0 0 0 0 0
text6 0 0 0 0 0 0 0 0 0 0
text7 1 0 0 0 0 0 0 0 0 1
text8 0 0 0 0 0 0 0 0 0 0
text9 0 0 0 0 0 0 0 0 0 0
text10 0 0 0 0 0 0 0 0 0 0
TED.tfidf <- dfm_tfidf(TED.dfm)  
kable(TED.tfidf[1:10,1:10], caption = "The example of TFIDF matrix") %>%
  kable_paper() %>%
  kableExtra::scroll_box(width = "100%", height = "200px")
The example of TFIDF matrix
doc_id today artificial intelligence help doctor diagnose patient pilot fly commercial
text1 0.7204546 2.169655 1.943426 1.057023 8.380564 6.944996 11.890971 1.91234 1.389461 1.991521
text2 0.0000000 1.084827 0.000000 0.000000 0.000000 0.000000 4.459114 0.00000 0.000000 0.000000
text3 0.0000000 2.169655 1.943426 0.000000 0.000000 0.000000 0.000000 0.00000 0.000000 0.000000
text4 0.0000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.00000 0.000000 0.000000
text5 0.0000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.00000 0.000000 0.000000
text6 0.0000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.00000 0.000000 0.000000
text7 0.7204546 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.00000 0.000000 1.991521
text8 0.0000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.00000 0.000000 0.000000
text9 0.0000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.00000 0.000000 0.000000
text10 0.0000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.00000 0.000000 0.000000

4. Exploratory Data Analysis

In this section, we perform initial investigations on TED (one transcript per video) to discover patterns and spot anomalies using summary statistics and graphical representations. We present an analysis of word frequency, a comparison of videos in terms of lexical diversity, a comparison of videos in terms of keyness, and the connections between terms computed via co-occurrence.

4.1 Analysis of the word frequency (plot of frequencies and TF-IDF)

TED.freq1 %>% 
  top_n(20, frequency) %>%
  ggplot(aes(
    x = reorder(feature, frequency),
    y = frequency)) + 
  geom_bar(stat = "identity") +
  coord_flip() + #change x y axis
  ylab("Frequency") + 
  xlab("term") +
  ggtitle("The top 20 most frequent terms")

textplot_wordcloud(TED.dfm1)

From the top-20 chart and the word cloud, we can see that the top 5 terms, “people”, “make”, “thing”, “time”, and “year”, are common terms. Since the corpus mixes 3 different topics, we do not expect topic-specific terms in the top ranks of the frequency chart. However, we do notice some terms related to our topics, such as “world”, “love”, “ai”, and “kind”.

We can also associate each text with its most frequent terms. For example, text12 is characterized by “people”, “em”, and “ca”; “people” is expected since it is a common term, while “em” and “ca” look specific to this text.

Subsequently, we investigate the highest TF-IDF terms to observe the terms specific to each document (video), presented below. For text12, “people” no longer appears, as expected: the TF-IDF of “people” is very low because the term is not specific to any text. On the other hand, “em” and “ca” are specific to text12. A closer look at text12 reveals a dialogue between Elon Musk and Chris Anderson, so “em” and “ca” stand for Elon Musk and Chris Anderson, respectively.

TED.tfidf1 %>% 
  tidy() %>% 
  top_n(5, count) %>% #may change to top 10
  ggplot(aes(x = term, y = count)) + 
  geom_col() + 
  coord_flip() + 
  theme(axis.text.y = element_text(size = 4),
        axis.ticks.y = element_blank())  + 
  facet_wrap(~document, ncol = 2)

For an overall view of the terms with at least one large TF-IDF, we compute, for each term, the maximum TF-IDF over all texts and present the top 20 below. The terms “em” and “regret” have the largest weighted frequencies, in the sense that their TF-IDF is large in at least one document; both exceed 200.

#sort(apply(TED.tfidf1, 2, max), decreasing = TRUE)[1:10]

TED.tfidf1 %>% 
  tidy() %>%
  group_by(term) %>%
  summarize(count = max(count)) %>% #use summarize to find max
  ungroup() %>% 
  arrange(desc(count)) %>%
  top_n(20, count) %>%
  ggplot(aes(x=reorder(term, count),
             y = count)) + 
  geom_bar(stat = "identity") + 
  coord_flip() +
  xlab("Max TF-IDF") + 
  ylab("term")

4.2 Comparison of videos in terms of Lexical Diversity

We perform a lexical diversity analysis based on the Type-Token Ratio (TTR), computed with the textstat_lexdiv() function from the quanteda.textstats package.
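For reference, the TTR of a document is defined as \(TTR = V/N\), where \(V\) is the number of distinct terms (types) and \(N\) is the total number of tokens; values closer to 1 indicate a richer vocabulary.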

As the chart and tables below show, the TTR varies considerably across texts. Some texts have a very high TTR (approximately 0.8 or more), indicating a rich vocabulary, and the values then decrease gradually to the lowest TTR (approximately 0.33).

From the tables below, text237 and text131 show the richest vocabulary among the texts (videos), with TTRs of 0.816 and 0.802, respectively, while text88 has the lowest TTR (0.3306), meaning it has the lowest vocabulary diversity among the texts (videos) in this corpus (sample).

ttr_top <- TED.dfm1 %>% textstat_lexdiv() %>% arrange(desc(TTR)) %>% top_n(10, TTR)
ttr_bottom <- TED.dfm1 %>% textstat_lexdiv() %>% arrange(desc(TTR)) %>% top_n(-10, TTR)

TED.dfm1 %>% textstat_lexdiv() %>% 
  ggplot(aes(reorder(document, -TTR),
             TTR)) + 
  geom_bar(stat="identity") +
  xlab("Text") +
  ggtitle("Type-Token Ratio per text")

kable(cbind(ttr_top,ttr_bottom), caption = "Text with highest and lowest TTR", caption.above = TRUE) %>%
  kable_paper() %>%  
  kableExtra::scroll_box(width = "100%", height = "200px")
Text with highest and lowest TTR
document TTR document TTR
text237 0.8160000 text240 0.3982642
text131 0.8024691 text3 0.3881090
text103 0.7861111 text44 0.3844857
text106 0.7833333 text137 0.3824734
text199 0.7745902 text47 0.3813559
text82 0.7632509 text271 0.3705882
text124 0.7577640 text125 0.3585291
text136 0.7575758 text119 0.3420943
text128 0.7538462 text12 0.3357307
text236 0.7337662 text88 0.3306496

5. Sentiment analysis

In this section, we employ two dictionaries, AFINN and NRC, together with the valence-shifters method, to conduct sentiment analysis on the full transcript of each video (i.e., here we do not split the transcripts into 20-sentence segments). This lets us examine whether the sentiment of a video relates to its other features. We formulate the following hypotheses:

  • The sentiment of a video is related to its topic; for example, the AI topic may carry more positive sentiment.
  • The sentiment of a video influences its number of likes; we expect positive videos to receive more likes.
  • The sentiment of videos has changed over the years, as it may be related to current affairs.
# restore the cate labels for easier interpretation
TED_sentiment$cate <- gsub("1","AI",TED_sentiment$cate)
TED_sentiment$cate <- gsub("2","Climate change",TED_sentiment$cate)
TED_sentiment$cate <- gsub("3","Relationships",TED_sentiment$cate)

# sentiment analysis cannot use the cleaned/lemmatized tokens, so we tokenize the raw transcripts again here
TED.tok <- unnest_tokens(
  TED_sentiment,
  output = "word",
  input = "tanscript",
  to_lower = TRUE,
  strip_punct = TRUE,
  strip_numeric = TRUE)
TED.tok <- TED.tok %>% filter(!(word %in% c("laughter", "applaud")))

5.1 Sentiment Based

First, we apply the NRC lexicon to determine the sentiments expressed in the transcript of each video. As this method labels words with discrete sentiment categories, we focus on how these categories relate to the topics of the videos and to the number of likes they receive.

# NRC
# join the corresponding sentiment qualifier in “nrc” 

TED.sent.nrc <- 
  inner_join(
    TED.tok,
    get_sentiments("nrc"),
    by = c("word" = "word"))

head(TED.sent.nrc, 5) %>% flextable() %>% autofit()

5.1.1 Sentiment v.s. Likes

Since there are nearly 300 transcripts (videos), we extract the 20 videos with the most likes and the 20 with the fewest likes for inspection.

In this part, we apply the NRC counts in two ways: once without scaling, and once after re-scaling the sentiment counts by the length of each document.

# Sub data for checking Video likes topic
TED.nrc <- TED.sent.nrc %>% 
  group_by(title,cate,likes,sentiment) %>% summarise(n=n())

# too many text, hard to read
# extract top 20, tail 20 transcipt to check their sentiment
toplike20 <- TED.nrc[order(TED.nrc$likes,decreasing = T),][1:200,]
taillike20 <- TED.nrc[order(TED.nrc$likes,decreasing = F),][1:200,]

# top
toplike20%>%
  ggplot(mapping = aes(x = sentiment, y=n, fill = sentiment)) + 
  geom_bar(stat = "identity",
           alpha = 0.8) + 
  facet_wrap(~ title) + 
  coord_flip()+
  theme(legend.position = 'bottom')+
  labs(y="the number of sentiment")+
  ggtitle("The sentiment of 20 videos with most likes")

# tail
taillike20%>%
  ggplot(mapping = aes(x = sentiment, y=n, fill = sentiment)) + 
  geom_bar(stat = "identity",
           alpha = 0.8) + 
  facet_wrap(~ title) + 
  coord_flip()+
  theme(legend.position = 'bottom')+
  labs(y="the number of sentiment")+
  ggtitle("The sentiment of 20 videos with least likes")

# the frequencies of sentiments are computed, by document
TED.sent.nrc.total <- TED.sent.nrc %>% 
  group_by(title,likes) %>% 
  summarize(Total = n()) %>% 
  ungroup()

#top
left_join(
  TED.sent.nrc,
  TED.sent.nrc.total)%>% 
  filter(title %in% toplike20$title) %>%
  group_by(title, sentiment) %>%  
  summarize(n = n(),
            Total = unique(Total)) %>%
  ungroup() %>% 
  mutate(relfreq = n / Total) %>%
  ggplot(aes(
    x = sentiment,
    y = relfreq,
    fill = sentiment)) + 
  geom_bar(stat = "identity", alpha = 0.8) + 
  facet_wrap(~ title) + 
  coord_flip()+
  theme(legend.position = 'bottom')+
  labs(y="the number of sentiment")+
  ggtitle("The sentiment of 20 videos with most likes (Re-scale sentiment by their length)")

#tail
left_join(
  TED.sent.nrc,
  TED.sent.nrc.total)%>% 
  filter(title %in% taillike20$title) %>%
  group_by(title, sentiment) %>%  
  summarize(n = n(),
            Total = unique(Total)) %>%
  ungroup() %>% 
  mutate(relfreq = n / Total) %>%
  ggplot(aes(
    x = sentiment,
    y = relfreq,
    fill = sentiment)) + 
  geom_bar(stat = "identity", alpha = 0.8) + 
  facet_wrap(~ title) + 
  coord_flip()+
  theme(legend.position = 'bottom')+
  labs(y="the number of sentiment")+
  ggtitle("The sentiment of 20 videos with least likes (Re-scale sentiment by their length)")

We do not observe significant differences in the distribution of sentiments, such as positive and anticipation, between videos with high and low numbers of likes. Both positive and anticipation sentiments are present across all videos, and some videos in the top 20 also exhibit relatively high levels of negative and fear sentiments.

5.1.2 Sentiment v.s. Topics

Next, we examine the frequency of each sentiment per topic and compare the results to determine which sentiments are more prevalent in each topic. For the Climate change topic, we expect a higher frequency of negative or fear-related sentiments, given the potentially catastrophic consequences of climate change. For the AI topic, we expect more anticipation or positive sentiments, as AI has the potential to bring many benefits and advancements.

# it is hard to check the sentiment for each video, then check it for each cate

TED.nrc %>% 
  group_by(cate,sentiment) %>%
  summarise(cate_n = sum(n)) %>%
  ggplot(mapping = aes(subgroup = cate, fill = interaction(sentiment, cate), area = cate_n)) +
  geom_treemap(color="white", size=0.5*.pt, alpha=NA) +
  geom_treemap_subgroup_text(
    place = "center", alpha = 0.5, grow = TRUE) + 
  geom_treemap_text(mapping = aes(
    label = sentiment), 
    color = "white",
    place = "center", grow = FALSE) +
  guides(fill = FALSE)

As expected, the topic of AI is often accompanied by positive and anticipation sentiments, along with a notable amount of trust; however, negative sentiment also accounts for a sizable share. Contrary to our speculation, positive is also the most frequent sentiment in the Climate change topic. The sentiments are more evenly distributed across the Relationships videos, although positive is still the largest.

Based on this analysis, we begin to suspect that positive sentiment is in fact the dominant sentiment across all TED talks.

5.2 Value-Based

To test this assumption, we check whether positive sentiment appears across all videos using a value-based method: AFINN.

# Afinn

TED.sent.afinn <- 
  inner_join(
    TED.tok,
    get_sentiments("afinn"),
    by = c("word" = "word"))
TED.sent.afinn %>% 
  group_by(title,cate) %>% 
  summarize(Score = mean(value)) %>% 
  ungroup() %>% 
  ggplot(aes(x = reorder_within(title, Score,cate), y = Score, fill = cate)) + 
  geom_bar(stat = "identity") + 
  coord_flip() +
  ylab("Mean Sentiment Score") +
  xlab("")

Here, we calculate the average sentiment score per video and notice that most videos have positive scores; TED talks indeed tend to present positive content. From the topic perspective, however, it is difficult to distinguish the topics by their sentiment scores.

#extract the top and tail videos
video_sentiscore <- TED.sent.afinn %>% 
  group_by(title) %>% 
  summarize(Score = mean(value))

5.2.1 Sentiment v.s. Likes

TED.sent.afinn.like <- TED.sent.afinn %>% 
  group_by(title,cate,likes) %>% 
  summarize(Score = mean(value))

TED.sent.afinn.like %>% ggplot(aes(x=likes,y=Score,color = cate))+
  geom_point(size=1) +
  geom_smooth(method = "lm")+
  facet_wrap(~cate,scales = 'free')+
  theme(legend.position = 'bottom')

We separate the categories to observe the joint distribution of the number of likes and the sentiment scores. There is no obvious pattern, although the sentiment of the Climate change topic is more widely distributed than that of AI and Relationships.

5.2.2 Sentiment v.s. Topics

TED.sent.afinn.cate <- TED.sent.afinn %>% 
  group_by(title,cate) %>% 
  summarize(Score = mean(value))

TED.sent.afinn.cate %>% 
  ggplot(mapping = aes(x = cate, y = Score))+
  geom_boxplot()+
  labs(x="Topics")

The sentiment scores for each topic are relatively similar and all sit in the upper-middle (more positive) range. We also spot two outliers with large negative scores in the AI topic.

5.2.3 Sentiment over years

TED.sent.afinn.year <- TED.sent.afinn 

TED.sent.afinn.year <- TED.sent.afinn.year %>% 
  group_by(title,cate,posted) %>% 
  summarize(Score = mean(value))

TED.sent.afinn.year %>% ggplot(aes(x=posted,y=Score))+
  geom_point()+
  geom_smooth(method = "lm")+
  theme(legend.position = 'bottom')+
  labs(x="Posted Year")+
  facet_wrap(~cate)+
  scale_x_date(date_minor_breaks = "2 day")

Overall, the sentiment scores show a slight downward trend, but there is no clear correlation between sentiment and posting year.

5.3 Using Valence-Shifters

In this section, we check whether the results change once valence shifters (e.g., negators and amplifiers) are taken into account.
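A small example may help; this is a minimal sketch assuming the sentimentr package used below, where the negator in the second sentence flips the polarity that "good" would contribute on its own.

# Minimal valence-shifter illustration (sentimentr):
# "not" is a negator, so the second sentence scores negative
# even though "good" alone is a positive word
sentiment(get_sentences(c("This is good.", "This is not good.")))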

## split by sentences
TED_sentiment_text <- get_sentences(TED_sentiment$tanscript)
## Compute the sentiment by sentences
TED.senti <- sentiment(TED_sentiment_text)
## Prepare a tibble for the plot
TED.senti <- as_tibble(TED.senti)

TED.sentdoc <- sentiment_by(TED_sentiment$tanscript)

TED.sentdoc %>% 
  mutate(Document = factor(paste("Doc_", element_id, sep = ""))) %>% 
  ggplot(aes(x = reorder(Document, ave_sentiment),
             y = ave_sentiment)) + 
  geom_bar(stat="identity") + 
  coord_flip() +
  xlab("") +
  ylab("Average Sentiment Score")

#check the difference between the AFINN method and valence shifters
#count the number of documents with average score < 0

# with valence shifters (sentimentr)
negative_vs <- sum(TED.sentdoc$ave_sentiment < 0)

# with AFINN (no valence shifters)
negative_afinn <- TED.sent.afinn %>% 
  group_by(title) %>% 
  summarize(Score = mean(value)) %>%
  filter(Score < 0 )

n_negative_afinn <- length(negative_afinn$title)

# [1] 19
# [1] 31

We conclude that the sentiment scores are distributed similarly to those obtained without valence shifters. Counting the transcripts with negative average scores, we find 31 videos with negative values under AFINN (without valence shifters) and 19 after taking valence shifters into account.

6. Topic Modeling

Topic modeling is a method for discovering the latent themes or topics that exist within a collection of documents. Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) are two popular techniques for topic modeling.

6.1 LSA

6.1.1 LSA on TF

First, we build the LSA object with 4 dimensions. Latent Semantic Analysis (LSA) decomposes the DTM (TED.dfm) into three matrices, \(M = U\Sigma V^{T}\), truncated to 4 dimensions (topics). We examine the three matrices: U (document-topic similarity), \(\Sigma\) (topic strength, i.e., the singular values), and V (term-topic similarity).

The document-topic similarity table below shows the association between each text and each dimension. For example, text1 loads most strongly on dimension 2.

TED.lsa <- textmodel_lsa(x = TED.dfm, nd = 4)
kable(TED.lsa$docs, 
      col.names = c("dimension1", "dimension2", "dimension3", "dimension4"),
      caption = "Doc-topic sim. (LSA on TF)") %>%
  kable_paper() %>%
  kableExtra::scroll_box(width = "100%", height = "200px")
Doc-topic sim. (LSA on TF)
dimension1 dimension2 dimension3 dimension4
text1 -0.0279833 0.0561772 -0.0062328 0.0010815
text2 -0.0215096 0.0463497 -0.0014294 -0.0103448
text3 -0.0307021 0.0859480 0.0117433 -0.0083037
text4 -0.0380724 0.1135957 0.0104629 -0.0307621
text5 -0.0356207 0.0885008 0.0106824 -0.0660644
(remaining rows omitted; the full document-topic table, one row per transcript, is rendered as a scrollable box in the report)
text1176 -0.0135500 -0.0010902 0.0052918 0.0023271
text1177 -0.0307963 -0.0267395 0.0446514 -0.0326795
text1178 -0.0206176 -0.0121215 0.0117303 -0.0004180
text1179 -0.0111715 -0.0098135 0.0080941 -0.0052822
text1180 -0.0338346 -0.0004313 0.0085451 0.0042449
text1181 -0.0175080 -0.0074185 0.0188767 0.0004048
text1182 -0.0202454 -0.0070456 0.0091003 0.0057953
text1183 -0.0155649 -0.0067177 0.0129922 0.0030009
text1184 -0.0245622 -0.0164076 0.0205107 -0.0060800
text1185 -0.0272548 -0.0293760 0.0307847 -0.0016413
text1186 -0.0180257 -0.0187436 0.0205471 -0.0041048
text1187 -0.0289491 -0.0165774 0.0120793 0.0021229
text1188 -0.0117816 -0.0060870 0.0047310 0.0067572
text1189 -0.0289491 -0.0165774 0.0120793 0.0021229
text1190 -0.0117816 -0.0060870 0.0047310 0.0067572
text1191 -0.0220958 -0.0344722 0.0354198 -0.0219007
text1192 -0.0135438 -0.0213612 0.0186945 -0.0151176
text1193 -0.0155481 -0.0152616 0.0146011 -0.0042045
text1194 -0.0303074 -0.0384828 0.0401322 -0.0238906
text1195 -0.0178494 -0.0193421 0.0227931 -0.0089324
text1196 -0.0258268 -0.0187733 0.0248161 -0.0026782
text1197 -0.0125308 -0.0165875 0.0203004 -0.0128399
text1198 -0.0189107 -0.0205081 0.0142706 -0.0080079
text1199 -0.0453731 -0.0414709 0.0336012 -0.0086971
text1200 -0.0236201 -0.0223427 0.0113457 -0.0039668
text1201 -0.0355549 -0.0522757 0.0577249 -0.0257034
text1202 -0.0100708 -0.0075089 0.0060412 -0.0023116
text1203 -0.0200904 -0.0020782 0.0270178 -0.0162808
text1204 -0.0115494 -0.0032823 0.0083166 -0.0061974
text1205 -0.0158367 -0.0097220 0.0110677 -0.0041147
text1206 -0.0101677 -0.0084394 0.0024466 0.0003488
text1207 -0.0068907 -0.0051421 0.0003538 0.0034734
text1208 -0.0171843 -0.0152322 0.0181958 -0.0074204
text1209 -0.0109044 -0.0060279 0.0105468 -0.0008077
text1210 -0.0128322 -0.0081037 0.0037305 0.0037115
text1211 -0.0143948 -0.0089540 0.0045398 0.0032300
text1212 -0.0020415 -0.0002103 -0.0002492 -0.0002210
text1213 -0.0108836 -0.0110430 0.0056160 0.0015766
text1214 -0.0125390 -0.0114208 0.0094343 -0.0020678
text1215 -0.0087263 -0.0049167 0.0022342 0.0018564
text1216 -0.0257589 -0.0227024 0.0352722 -0.0137017
text1217 -0.0408130 -0.0465370 0.0525037 -0.0341073
text1218 -0.0370305 -0.0422787 0.0381864 -0.0178016
text1219 -0.0232201 -0.0203957 0.0137902 -0.0119432
text1220 -0.0308401 -0.0193076 0.0254629 0.0021493
text1221 -0.0084205 -0.0087235 0.0119914 -0.0032318
text1222 -0.0320554 -0.0361506 0.0418538 -0.0307169
text1223 -0.0105421 -0.0118383 0.0178973 -0.0135633
text1224 -0.0237996 -0.0187240 0.0156239 0.0000426
text1225 -0.0289078 -0.0253283 0.0181026 0.0008869
text1226 -0.0193353 -0.0248096 0.0130660 -0.0009890
text1227 -0.0234165 -0.0221570 0.0273415 -0.0029245
text1228 -0.0292630 -0.0318912 0.0246364 -0.0046825
text1229 -0.0185570 -0.0284533 0.0274355 -0.0115885
text1230 -0.0253651 -0.0223118 0.0169288 -0.0058940
text1231 -0.0313252 -0.0175566 0.0250452 -0.0021975
text1232 -0.0087841 -0.0063179 0.0077797 0.0010684
text1233 -0.0313854 -0.0446784 0.0192770 -0.0115624
text1234 -0.0269495 -0.0430686 0.0382685 -0.0222427
text1235 -0.0348015 -0.0354458 0.0260897 -0.0047621
text1236 -0.0225520 -0.0231180 0.0100001 -0.0063650
text1237 -0.0245876 -0.0147993 0.0123905 0.0037193
text1238 -0.0194184 -0.0108746 0.0141911 0.0048471
text1239 -0.0049818 -0.0041798 -0.0024463 0.0011828
text1240 -0.0158120 -0.0188296 0.0146039 -0.0038907
text1241 -0.0183169 -0.0088038 0.0115462 -0.0022004
text1242 -0.0223971 -0.0051785 0.0122446 -0.0084637
text1243 -0.0159546 -0.0167132 0.0262412 -0.0131933
text1244 -0.0132718 -0.0094181 0.0109666 -0.0048375
text1245 -0.0018278 -0.0012224 0.0028021 -0.0017643
text1246 -0.0232929 -0.0122365 0.0127699 0.0017122
text1247 -0.0206830 -0.0113211 0.0222255 0.0026732
text1248 -0.0309174 -0.0032887 0.0265568 -0.0166240
text1249 -0.0133048 -0.0108564 0.0083389 -0.0015538
text1250 -0.0310419 -0.0202752 0.0339144 -0.0172231
text1251 -0.0154858 -0.0072497 0.0061040 0.0004323
text1252 -0.0144430 -0.0133811 0.0220287 -0.0015614
text1253 -0.0136958 -0.0077369 0.0111449 0.0063699
text1254 -0.0175222 -0.0022748 0.0076530 0.0006984
text1255 -0.0132406 -0.0099549 0.0065488 -0.0053206
text1256 -0.0122528 -0.0112212 0.0035871 -0.0006870
text1257 -0.0213554 -0.0128326 0.0225194 -0.0129965
text1258 -0.0262546 -0.0145498 0.0142721 -0.0074675
text1259 -0.0242353 -0.0235571 0.0243342 -0.0041237
text1260 -0.0105316 -0.0058008 0.0109402 0.0017899
text1261 -0.0121728 -0.0071155 0.0116553 -0.0029772
text1262 -0.0000251 -0.0000279 0.0000919 -0.0001118
text1263 -0.0377047 -0.0422378 0.0096580 -0.0202800
text1264 -0.0197934 -0.0230440 0.0076974 -0.0092711
text1265 -0.0144286 -0.0106025 0.0061444 0.0033814
text1266 -0.0248365 -0.0339680 0.0067009 -0.0114112
text1267 -0.0368971 -0.0344407 0.0076351 0.0021460
text1268 -0.0322683 -0.0064628 0.0150060 -0.0064799
text1269 -0.0227769 -0.0093380 0.0063918 0.0022550
text1270 -0.0350511 -0.0114296 0.0074299 -0.0099809
text1271 -0.0397903 -0.0177816 0.0227249 -0.0164543
text1272 -0.0319448 -0.0252019 0.0216191 -0.0023129
text1273 -0.0297903 -0.0186123 0.0097407 0.0041726
text1274 -0.0244971 -0.0130057 0.0258249 -0.0035841
text1275 -0.0419078 -0.0443082 0.0479571 -0.0216686
text1276 -0.0084714 -0.0057896 0.0068551 -0.0022203
text1277 -0.0215771 -0.0249815 0.0342694 -0.0059751
text1278 -0.0189365 -0.0148563 0.0214280 0.0025343
text1279 -0.0343544 -0.0389256 0.0434151 -0.0129715
text1280 -0.0201428 -0.0227046 0.0288973 -0.0093314
text1281 -0.0371157 -0.0384304 0.0334478 -0.0281377
text1282 -0.0244388 -0.0205729 0.0202058 -0.0146421
text1283 -0.0263357 -0.0216308 0.0215080 -0.0146242
text1284 -0.0319279 -0.0435679 0.0315295 -0.0161905
text1285 -0.0295901 -0.0306222 0.0219509 -0.0114371
text1286 -0.0118790 -0.0158113 0.0130598 -0.0045997
text1287 -0.0198203 -0.0197079 -0.0020638 -0.0019390
text1288 -0.0370017 -0.0278708 0.0315988 -0.0161806
text1289 -0.0205322 -0.0203491 0.0096171 -0.0062514
text1290 -0.0171256 -0.0166705 0.0102353 -0.0074171
text1291 -0.0186634 -0.0126053 0.0129029 -0.0022922
text1292 -0.0197515 -0.0224775 0.0105363 -0.0041338
text1293 -0.0143811 -0.0164611 0.0144214 -0.0066032
text1294 -0.0274031 -0.0361216 0.0250000 -0.0106621
text1295 -0.0173053 -0.0166382 0.0095629 -0.0007631
text1296 -0.0218776 -0.0236391 0.0122054 -0.0064423
text1297 -0.0212042 -0.0204180 0.0263196 -0.0051083
text1298 -0.0281849 -0.0341458 0.0281037 -0.0118736
text1299 -0.0324331 -0.0702302 0.0772247 -0.0432351
text1300 -0.0203909 -0.0370155 0.0406581 -0.0232335
text1301 -0.0331164 -0.0245485 0.0500120 -0.0235915
text1302 -0.0297959 -0.0326719 0.0296014 -0.0163616
text1303 -0.0256792 -0.0265764 0.0332029 -0.0160762
text1304 -0.0364908 -0.0491971 0.0529159 -0.0310092
text1305 -0.0263818 -0.0079193 0.0150449 -0.0082711
text1306 -0.0371510 -0.0415714 0.0261596 -0.0207030
text1307 -0.0192546 -0.0083204 0.0121943 -0.0030277
text1308 -0.0123517 -0.0091194 0.0102810 -0.0004446
text1309 -0.0165899 -0.0107063 -0.0036836 0.0096940
text1310 -0.0170364 -0.0077991 0.0091901 0.0045287
text1311 -0.0294660 -0.0267418 0.0297496 -0.0026375
text1312 -0.0179177 -0.0087022 0.0029880 0.0060890
text1313 -0.0229068 -0.0190110 0.0123486 -0.0007823
text1314 -0.0197625 -0.0166414 0.0039465 0.0011540
text1315 -0.0047663 -0.0047004 0.0031627 -0.0015755
text1316 -0.0175584 -0.0150533 0.0074115 0.0026292
text1317 -0.0302054 -0.0285164 0.0410343 -0.0119864
text1318 -0.0129722 -0.0127727 0.0142757 -0.0069361
text1319 -0.0141781 -0.0106203 0.0207878 0.0000727
text1320 -0.0191252 -0.0124387 0.0140555 -0.0016036
text1321 -0.0043648 -0.0032402 0.0025957 -0.0027211
text1322 -0.0212011 -0.0177168 0.0119843 -0.0025866
text1323 -0.0327393 -0.0354460 0.0288433 -0.0126210
text1324 -0.0330357 -0.0262942 0.0212871 -0.0113363
text1325 -0.0130956 -0.0116058 0.0044406 -0.0063922
text1326 -0.0262683 -0.0322135 0.0100353 -0.0179529
text1327 -0.0049794 -0.0052195 -0.0029593 -0.0014452
text1328 -0.0214682 -0.0179034 0.0145458 0.0021978
text1329 -0.0193344 -0.0191635 0.0221284 -0.0038298
text1330 -0.0418091 -0.0200834 0.0200841 0.0040169
text1331 -0.0420066 -0.0592393 0.0589938 -0.0366837
text1332 -0.0283126 -0.0408920 0.0320490 -0.0220936
text1333 -0.0322400 -0.0275556 0.0088936 0.0025147
text1334 -0.0069879 0.0066040 -0.0026317 -0.0019740
text1335 -0.0159419 0.0013483 -0.0028675 -0.0026103
text1336 -0.0106301 -0.0068354 -0.0061622 0.0076143
text1337 -0.0330354 -0.0119123 0.0132233 0.0024219
text1338 -0.0229965 -0.0099492 0.0163584 0.0101580
text1339 -0.0040152 -0.0052168 0.0041046 -0.0010303
text1340 -0.0215651 -0.0228215 0.0162005 -0.0001731
text1341 -0.0085816 -0.0081415 -0.0014404 0.0019583
text1342 -0.0304100 -0.0245165 0.0194532 -0.0120615
text1343 -0.0078934 0.0005730 -0.0087319 -0.0010338
text1344 -0.0175647 -0.0056119 0.0069379 0.0053781
text1345 -0.0112575 -0.0036575 0.0041023 0.0032193
text1346 -0.0083523 -0.0049881 -0.0003655 0.0023757
text1347 -0.0050436 0.0058937 -0.0007687 -0.0013930
text1348 -0.0199133 -0.0241320 0.0191006 -0.0060704
text1349 -0.0182670 -0.0168302 0.0203179 0.0003837
text1350 -0.0253510 -0.0202274 0.0228139 0.0030059
text1351 -0.0148204 -0.0153495 0.0098272 -0.0029220
text1352 -0.0106957 -0.0087951 -0.0023483 -0.0002370
text1353 -0.0236027 -0.0180741 0.0200338 0.0004266
text1354 -0.0123879 -0.0023628 0.0025544 0.0048047
text1355 -0.0127741 -0.0071121 0.0117839 -0.0072104
text1356 -0.0157944 -0.0188414 0.0238667 -0.0031225
text1357 -0.0138435 -0.0169481 0.0225248 -0.0061239
text1358 -0.0374851 -0.0345558 0.0147089 -0.0104329
text1359 -0.0156332 -0.0173687 0.0157866 -0.0090199
text1360 -0.0238672 -0.0309938 0.0184098 -0.0136054
text1361 -0.0278060 -0.0294771 0.0018664 -0.0108769
text1362 -0.0108054 -0.0167107 0.0105743 -0.0070844
text1363 -0.0162729 -0.0149582 0.0149370 -0.0052254
text1364 -0.0169485 -0.0082989 0.0037341 0.0063032
text1365 -0.0255821 -0.0259972 0.0138923 -0.0039673
text1366 -0.0139558 -0.0141919 0.0110847 0.0002414
text1367 -0.0172765 -0.0175659 0.0194921 -0.0102851
text1368 -0.0108898 -0.0188032 0.0233461 -0.0085326
text1369 -0.0046710 -0.0079571 0.0113909 -0.0039857
text1370 -0.0270755 -0.0427760 0.0632609 -0.0284076
text1371 -0.0209777 -0.0115480 0.0271077 -0.0166764
text1372 -0.0083622 -0.0129381 0.0155152 -0.0115692
text1373 -0.0324820 -0.0141705 0.0094045 0.0039952
text1374 -0.0179198 -0.0124267 0.0135126 -0.0032875
text1375 -0.0250306 -0.0240338 0.0201811 -0.0018895
text1376 -0.0294281 -0.0239447 0.0234034 0.0012512
text1377 -0.0296595 -0.0149764 0.0203755 -0.0007243
text1378 -0.0011634 -0.0007094 0.0001343 -0.0004430
text1379 -0.0210614 -0.0186492 0.0166222 0.0067423
text1380 -0.0192499 -0.0114692 0.0027979 0.0026075
text1381 -0.0263224 -0.0130773 0.0024896 0.0070165
text1382 -0.0282173 -0.0255885 0.0290016 -0.0041299
text1383 -0.0090403 -0.0068360 0.0050652 -0.0003802
text1384 -0.0160612 -0.0135646 0.0094146 0.0046550
text1385 -0.0224823 -0.0084191 0.0065262 0.0005661
text1386 -0.0252930 -0.0191878 0.0156297 -0.0000676
text1387 -0.0200779 -0.0188203 0.0127996 0.0014429
text1388 -0.0209746 -0.0158269 0.0078984 0.0069744
text1389 -0.0180291 -0.0128333 0.0005124 0.0053052
text1390 -0.0121053 -0.0066820 0.0057504 0.0020275
text1391 -0.0453184 -0.0289202 0.0314715 -0.0259590
text1392 -0.0253356 -0.0356443 0.0355079 -0.0215467
text1393 -0.0300863 -0.0329645 0.0431182 -0.0164955
text1394 -0.0346267 -0.0497580 0.0621428 -0.0218888
text1395 -0.0302935 -0.0337548 0.0352059 -0.0144010
text1396 -0.0280730 -0.0201772 0.0277154 0.0019051
text1397 -0.0169138 -0.0077611 0.0253478 0.0008448
text1398 -0.0239955 -0.0158739 0.0318515 0.0005385
text1399 -0.0182385 -0.0123515 0.0161897 0.0026435
text1400 -0.0149582 -0.0085011 0.0089844 0.0003615
text1401 -0.0189402 -0.0060880 0.0000486 0.0036233
text1402 -0.0134178 -0.0045263 0.0080436 -0.0004206
text1403 -0.0101801 -0.0050896 0.0116766 0.0085836
text1404 -0.0313735 -0.0105271 0.0207052 0.0106364
text1405 -0.0255722 -0.0175560 0.0224601 -0.0073196
text1406 -0.0245900 -0.0090380 0.0043929 0.0093472
text1407 -0.0221980 -0.0192938 0.0212631 -0.0008437
text1408 -0.0251982 -0.0240122 0.0420521 -0.0261952
text1409 -0.0203148 -0.0332502 0.0439579 -0.0239312
text1410 -0.0159374 -0.0197010 0.0208139 -0.0130146
text1411 -0.0424525 -0.0414119 0.0613792 -0.0212083
text1412 -0.0186321 -0.0237309 0.0287905 -0.0012285
text1413 -0.0319115 -0.0350255 0.0422151 -0.0165795
text1414 -0.0179175 -0.0153722 0.0222230 -0.0108068
text1415 -0.0319043 -0.0165855 0.0289279 -0.0061888
text1416 -0.0405396 0.1034778 -0.0073124 -0.0888108
text1417 -0.0283061 0.0478516 0.0110753 -0.0490471
text1418 -0.0429285 0.0910602 0.0104452 -0.0723139
text1419 -0.0366631 0.0706994 -0.0092534 -0.0669935
text1420 -0.0100288 0.0308001 -0.0001222 -0.0126220
text1421 -0.0405396 0.1034778 -0.0073124 -0.0888108
text1422 -0.0283061 0.0478516 0.0110753 -0.0490471
text1423 -0.0429285 0.0910602 0.0104452 -0.0723139
text1424 -0.0366631 0.0706994 -0.0092534 -0.0669935
text1425 -0.0100288 0.0308001 -0.0001222 -0.0126220
text1426 -0.0138540 -0.0064632 0.0021442 -0.0032020
text1427 -0.0187671 -0.0166677 0.0052083 0.0048837
text1428 -0.0188015 -0.0161834 0.0062781 -0.0054609
text1429 -0.0128209 -0.0171851 0.0146064 -0.0113797
text1430 -0.0263307 -0.0275192 0.0145038 -0.0019497
text1431 -0.0258309 -0.0305303 0.0226093 -0.0095027
text1432 -0.0229858 -0.0188972 0.0228609 -0.0017898
text1433 -0.0085748 -0.0095649 0.0088374 -0.0038393
text1434 -0.0335303 -0.0160602 0.0127772 0.0113703
text1435 -0.0155297 -0.0107261 0.0137074 0.0049370
text1436 -0.0229171 -0.0300761 0.0207464 -0.0097378
text1437 -0.0241654 -0.0261816 0.0153819 -0.0017239
text1438 -0.0489894 -0.0481185 0.0289188 -0.0169174
text1439 -0.0250466 -0.0154327 0.0117028 -0.0065938
text1440 -0.0520046 -0.0583742 0.0662490 -0.0504839
text1441 -0.0334258 -0.0393509 0.0371494 -0.0235328
text1442 -0.0327471 -0.0393492 0.0406206 -0.0202905
text1443 -0.0277549 -0.0257793 0.0298059 -0.0185247
text1444 -0.0269057 -0.0249537 0.0342853 -0.0057383
text1445 -0.0398505 -0.0400880 0.0362299 -0.0203891
text1446 -0.0322838 -0.0288647 0.0217970 -0.0087234
text1447 -0.0168690 -0.0186688 0.0209756 -0.0132781
text1448 -0.0172030 -0.0046078 0.0008116 0.0049950
text1449 -0.0207689 -0.0115445 0.0116389 0.0012889
text1450 -0.0146050 -0.0106660 0.0073105 0.0028472
text1451 -0.0090539 -0.0066361 0.0060523 -0.0021878
text1452 -0.0054197 0.0031217 0.0024780 -0.0019646
text1453 -0.0225591 -0.0431354 0.0494379 -0.0250713
text1454 -0.0398156 -0.0359628 0.0492129 -0.0060549
text1455 -0.0111392 -0.0185834 0.0233839 -0.0095273
text1456 -0.0283824 -0.0605737 0.0721800 -0.0341658
text1457 -0.0048972 -0.0133059 0.0162561 -0.0070454
text1458 -0.0280098 -0.0185220 0.0092738 0.0040364
text1459 -0.0258793 -0.0131391 0.0180048 0.0078444
text1460 -0.0222697 -0.0190342 0.0231783 -0.0136937
text1461 -0.0311645 -0.0079515 0.0039554 -0.0004442
text1462 -0.0254623 -0.0215802 0.0097058 -0.0054261
text1463 -0.0065563 -0.0029828 -0.0001848 0.0011721
text1464 -0.0122730 -0.0092948 0.0032985 -0.0007902
text1465 -0.0398933 -0.0240134 0.0226183 0.0076573
text1466 -0.0182342 -0.0106780 0.0117423 -0.0024338
text1467 -0.0296004 -0.0191827 0.0214830 -0.0001373
text1468 -0.0717713 -0.0122703 0.0406497 -0.0389009
text1469 -0.0048696 -0.0011797 0.0014782 -0.0006533
text1470 -0.0268300 -0.0175015 0.0134075 -0.0086325
text1471 -0.0117531 -0.0020362 0.0015143 0.0032228


The topic-strength table below reports the strength (singular value) of each LSA dimension.

CN <- c("dimension1","dimension2","dimension3","dimension4")
Topic_Strength <- data.frame(CN,TED.lsa$sk)
colnames(Topic_Strength) <- c("Dimension", "Topic strength")

head(Topic_Strength) %>% flextable() %>% add_header_lines("Topic strength by dimension") %>% autofit()
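
Because TED.lsa$sk contains the singular values of the decomposition, a rough relative-importance measure is their normalized squares. The snippet below is a small sketch added for illustration; the Share column is not part of the original table.

# Sketch: each dimension's squared singular value as a share of the total
# among the four retained dimensions (a rough share-of-variance measure).
Topic_Strength$Share <- TED.lsa$sk^2 / sum(TED.lsa$sk^2)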


The Terms-topic sim. table below shows the association between each term and each dimension. For example, the term “artificial” loads most strongly on dimension 2.

kable(head(TED.lsa$features,10), 
      col.names = c("dimension1","dimension2","dimension3","dimension4"),
      caption = "Terms-topic sim.(LSA on TF)") %>%
   kable_paper() %>%
   kableExtra::scroll_box(width = "100%", height = "200px")
Terms-topic sim.(LSA on TF)
dimension1 dimension2 dimension3 dimension4
today -0.0569966 0.0126785 -0.0485503 -0.0155983
artificial -0.0303165 0.0683780 -0.0026655 -0.0039735
intelligence -0.0516758 0.1320332 0.0036350 -0.0102737
help -0.0253760 0.0020966 0.0032362 -0.0121982
doctor -0.0138459 0.0036630 0.0071495 -0.0041132
diagnose -0.0058367 0.0095565 0.0023165 -0.0030806
patient -0.0130458 0.0135236 0.0042128 -0.0092575
pilot -0.0023414 0.0010596 -0.0031627 0.0018573
fly -0.0104628 0.0055722 -0.0042352 0.0232152
commercial -0.0024783 0.0036116 -0.0043273 -0.0011694


The first dimension of LSA is often correlated with document length (i.e., the total term frequency of each document). This phenomenon can be visualized with a scatter plot of document length against the first dimension of the latent semantic space.

doc.freq <- ntoken(TED.tk) # row-sum of the DTM.
data.frame(doc.freq,
           dim1 = TED.lsa$docs[, 1]) %>% 
  ggplot(aes(doc.freq, dim1)) + 
  geom_point() + 
  geom_smooth(method="lm",
              formula = 'y ~ x') +
  labs(
    title="The relationship between the number of tokens in documents and the values in LSA dimension 1",
    x="Number of tokens",
    y="LSA dim. 1"
  )
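
To complement the plot, we can quantify this association directly. The one-liner below is a minimal sketch reusing the objects defined above.

# Pearson correlation between the number of tokens per document and the
# scores on LSA dimension 1; a large absolute value supports the claim above.
cor(doc.freq, TED.lsa$docs[, 1])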

We then examine the top words in dimensions 2, 3, and 4. For each dimension, we look at the five terms with the largest values and the five with the lowest (i.e., most negative) values.

According to the table below, Dimension 2 is positively associated with “ai”, “human”, “robot”, “machine”, “datum”, and negatively associated with “feel”, “climate”, “life”, “love”, “people”.

n.terms <- 5
## For Dimension 2
w.order <- sort(TED.lsa$features[, 2],decreasing = TRUE)
w.top.d2 <- c(w.order[1:n.terms],rev(rev(w.order)[1:n.terms]))
## For Dimension 3
w.order <- sort(TED.lsa$features[, 3], decreasing = TRUE)
w.top.d3 <- c(w.order[1:n.terms], rev(rev(w.order)[1:n.terms]))
## For Dimension 4
w.order <- sort(TED.lsa$features[,4], decreasing = TRUE)
w.top.d4 <- c(w.order[1:n.terms], rev(rev(w.order)[1:n.terms]))
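
The three nearly identical blocks above can also be wrapped in a small helper. The function below is a sketch introduced for illustration (top_bottom is not used elsewhere in this report).

# Sketch of a helper: the n most positive and the n most negative terms
# for LSA dimension d, in decreasing order of loading.
top_bottom <- function(lsa_features, d, n = 5) {
  w <- sort(lsa_features[, d], decreasing = TRUE)
  c(head(w, n), tail(w, n))
}
# e.g., top_bottom(TED.lsa$features, 2) reproduces w.top.d2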

w.top.d2 <- t(w.top.d2)
kable(w.top.d2, 
      caption = "Top 5 and bottom 5 of Dimension2 (LSA on TF)") %>%
   kable_paper()
Top 5 and bottom 5 of Dimension2 (LSA on TF)
ai human robot machine datum feel climate life love people
0.5115474 0.394889 0.195335 0.1781869 0.1514245 -0.1080379 -0.11336 -0.1259553 -0.2119132 -0.2781561


The table below shows that Dimension 3 is positively associated with “people”, “love”, “robot”, “feel”, “life”, and negatively associated with “forest”, “year”, “energy”, “carbon”, “climate”.

w.top.d3 <- t(w.top.d3)
kable(w.top.d3, 
      caption = "Top 5 and bottom 5 of Dimension3 (LSA on TF)") %>%
   kable_paper()
Top 5 and bottom 5 of Dimension3 (LSA on TF)
people love robot feel life forest year energy carbon climate
0.2709439 0.2628304 0.1889753 0.1307637 0.1043617 -0.152073 -0.1816735 -0.1888159 -0.1949385 -0.2867198


The table below shows that Dimension 4 is positively associated with “robot”, “thing”, “rule”, “move”, “start”, and negatively associated with “datum”, “human”, “love”, “people”, “ai”.

w.top.d4 <- t(w.top.d4)
kable(w.top.d4, 
      caption = "Top 5 and bottom 5 of Dimension4 (LSA on TF)") %>%
   kable_paper() 
Top 5 and bottom 5 of Dimension4 (LSA on TF)
robot thing rule move start datum human love people ai
0.7714658 0.1265223 0.1009858 0.093242 0.073076 -0.0920711 -0.1281984 -0.1326988 -0.1940262 -0.3584263


To examine how the LSA dimensions relate to the text categories, we combine the LSA document scores with the category labels and plot every text in the two plots below.

# attach the category labels (2nd column of TED_full) to the LSA document scores
TED.lsa.source <- TED_full %>% 
  select(2) %>% cbind(as.data.frame(TED.lsa$docs))

# scatter plot of dimensions 2 and 3, colored by category
LSA_p1 <- ggplot(data = TED.lsa.source, mapping = aes(
  x = V2,
  y = V3,
  color = cate)) +
  geom_point() +
  labs(x = "dimension2",
       y = "dimension3",
       title = "Distribution of texts in different categories",
       subtitle = "LSA(TF): dimensions 2 and 3") +
  scale_colour_discrete(
    name = "Category",
    breaks = c("1", "2", "3"),
    labels = c("AI", "Climate change", "Relationships")
  ) +
  theme(plot.title = element_text(size = 12))

# scatter plot of dimensions 3 and 4, colored by category
LSA_p2 <- ggplot(data = TED.lsa.source, mapping = aes(
  x = V3,
  y = V4,
  color = cate)) +
  geom_point() +
  labs(x = "dimension3",
       y = "dimension4",
       title = "Distribution of texts in different categories",
       subtitle = "LSA(TF): dimensions 3 and 4") +
  scale_colour_discrete(
    name = "Category",
    breaks = c("1", "2", "3"),
    labels = c("AI", "Climate change", "Relationships")
  ) +
  theme(plot.title = element_text(size = 12))

# combine the two plots with a shared legend at the bottom
(LSA_p1 + LSA_p2) +
  plot_layout(guides = "collect") & theme(legend.position = 'bottom')

In the left-hand plot, the x-axis represents dimension 2 and the y-axis dimension 3. According to this plot, the “Climate change” category is negatively associated with dimension 3, while the “Relationships” category is positively associated with it. Additionally, the “AI” category is positively associated with dimension 2.

In the right-hand plot, the x-axis represents dimension 3 and the y-axis dimension 4. The “Climate change” and “Relationships” categories appear largely unrelated to dimension 4, whereas the “AI” category is associated with dimension 4, although the direction of this association varies.
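
As a numeric complement to the two plots, a small sketch using the same TED.lsa.source data frame (and its cate column) can summarize these associations by averaging each dimension within each category.

# Mean LSA score by category for dimensions 2-4; the signs should mirror
# the visual associations (e.g., "Climate change" negative on V3).
aggregate(cbind(V2, V3, V4) ~ cate, data = TED.lsa.source, FUN = mean)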

6.1.2 LSA on TF-IDF

We also perform LSA on the TF-IDF matrix to examine whether the weighted frequencies improve the interpretation.

The Doc-topic sim. table below shows the relationship between each text and each dimension. For example, text3 loads most strongly on dimension 3.

TED.lsa2 <- textmodel_lsa(TED.tfidf, nd = 4)  # LSA with 4 dimensions on the TF-IDF weighted DTM
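
As a quick sanity check of such statements, the minimal sketch below asks, for each text, which dimension carries its largest absolute loading.

# For each document, the index of the dimension with the largest absolute
# loading; for instance, text3 should map to dimension 3.
head(apply(abs(TED.lsa2$docs), 1, which.max))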

kable(TED.lsa2$docs, 
      col.names = c("dimension1","dimension2","dimension3","dimension4"),
      caption = "Doc-topic sim.(LSA on TF-IDF)") %>%
   kable_paper() %>%
   kableExtra::scroll_box(width = "100%", height = "200px")
Doc-topic sim.(LSA on TF-IDF)
dimension1 dimension2 dimension3 dimension4
text1 -0.0422496 -0.0329279 -0.0320857 -0.0394412
text2 -0.0275697 -0.0224974 -0.0249187 -0.0308737
text3 -0.0352744 -0.0394390 -0.0573503 -0.0317769
text4 -0.0387940 -0.0440947 -0.0690456 -0.0479835
text5 -0.0425726 -0.0337124 -0.0456903 -0.0725318
text6 -0.0213413 -0.0274721 -0.0331742 -0.0599822
text7 -0.0372246 -0.0300740 -0.0267797 -0.0323277
text8 -0.0367941 -0.0212970 -0.0162914 -0.0250740
text9 -0.0363915 -0.0263333 -0.0233207 -0.0359802
text10 -0.0308465 -0.0253658 -0.0140726 -0.0261814
… (remaining rows omitted for brevity; the full Doc-topic sim. table is available in the scrollable output)
text685 -0.0218017 0.0017496 0.0075261 0.0005815
text686 -0.0008341 -0.0000484 0.0003167 -0.0003152
text687 -0.0294826 0.0525483 0.0025487 0.0159175
text688 -0.0218478 0.0594675 -0.0056036 0.0100702
text689 -0.0324681 0.0275891 0.0096299 0.0958941
text690 -0.0248408 0.0130299 0.0063562 0.0606159
text691 -0.0276788 0.0424581 0.0005674 0.0695705
text692 -0.0330168 0.0322118 0.0036039 0.0742887
text693 -0.0234116 0.0333794 0.0013782 0.0575950
text694 -0.0172678 0.0123975 0.0033785 0.0256682
text695 -0.0265600 0.0263979 0.0073869 0.0201792
text696 -0.0151218 0.0154053 0.0073779 0.0063240
text697 -0.0365382 0.0430926 0.0058845 0.0034945
text698 -0.0314010 0.0368271 0.0065586 0.0013751
text699 -0.0183055 0.0135359 0.0023247 -0.0046471
text700 -0.0064287 0.0062137 0.0022448 0.0005478
text701 -0.0201975 0.0062831 0.0081265 0.0041036
text702 -0.0281952 0.0001075 0.0182931 0.0032085
text703 -0.0151371 0.0026053 0.0062732 0.0006684
text704 -0.0143971 0.0025503 0.0060462 0.0025095
text705 -0.0251493 0.0183700 0.0040365 -0.0024029
text706 -0.0055713 -0.0011378 0.0050554 0.0009880
text707 -0.0177151 0.0313001 -0.0023651 0.0034780
text708 -0.0223308 0.0258816 -0.0016152 -0.0003392
text709 -0.0398983 0.0569266 -0.0001252 0.0018040
text710 -0.0258718 0.0200117 -0.0029890 -0.0037680
text711 -0.0222132 0.0222822 0.0001243 0.0002503
text712 -0.0268489 0.0537548 -0.0030923 0.0067588
text713 -0.0196556 0.0314323 -0.0018647 0.0014590
text714 -0.0169696 0.0099146 0.0060030 0.0011907
text715 -0.0287600 0.0233242 -0.0066637 0.0035380
text716 -0.0484245 0.0157601 -0.0152792 0.0094048
text717 -0.0149937 0.0063315 -0.0017424 0.0038337
text718 -0.0332587 0.0352898 0.0166731 0.0085463
text719 -0.0271764 0.0265306 0.0097501 0.0037025
text720 -0.0319614 0.0358243 -0.0019429 -0.0053223
text721 -0.0285573 0.0414750 0.0051234 0.0112652
text722 -0.0244615 0.0023326 0.0039703 0.0195343
text723 -0.0243068 -0.0050690 0.0148262 0.0189064
text724 -0.0245378 0.0062035 0.0103577 0.0180337
text725 -0.0209935 0.0105354 0.0037799 0.0207526
text726 -0.0342847 0.0254355 0.0052807 0.0192268
text727 -0.0133441 0.0133247 0.0046170 0.0188285
text728 -0.0181619 0.0110725 0.0050446 0.0120632
text729 -0.0185992 0.0035101 0.0061049 0.0027559
text730 -0.0001732 -0.0000771 0.0001284 0.0000621
text731 -0.0143044 0.0282050 -0.0000299 0.0061347
text732 -0.0278966 0.0388279 0.0011675 0.0032374
text733 -0.0227865 0.0382998 -0.0009308 0.0050350
text734 -0.0291057 0.0421191 0.0064457 0.0088821
text735 -0.0281958 0.0590898 -0.0049763 0.0086137
text736 -0.0306648 0.0602258 -0.0037982 0.0058902
text737 -0.0233533 0.0276011 -0.0001171 -0.0014693
text738 -0.0266427 0.0238504 -0.0023232 0.0018872
text739 -0.0411281 0.0520672 0.0002702 0.0065194
text740 -0.0052262 0.0024649 0.0009380 0.0000647
text741 -0.0177528 0.0025667 0.0105687 0.0058623
text742 -0.0246961 -0.0034134 0.0135816 0.0005735
text743 -0.0206662 0.0012294 0.0146495 0.0012764
text744 -0.0155068 -0.0056620 0.0143713 -0.0006556
text745 -0.0126745 -0.0065477 0.0051615 -0.0007450
text746 -0.0068510 -0.0009971 0.0034338 -0.0008259
text747 -0.0231048 0.0382069 -0.0004206 0.0147966
text748 -0.0272097 0.0449351 0.0002707 0.0114755
text749 -0.0232822 0.0297316 0.0034152 -0.0004784
text750 -0.0239380 0.0345408 0.0007213 -0.0013759
text751 -0.0003473 0.0000109 0.0001063 0.0001000
text752 -0.0358348 0.0404270 -0.0069765 0.0109885
text753 -0.0148963 0.0167376 -0.0007375 0.0057612
text754 -0.0307180 0.0360638 0.0055145 -0.0020696
text755 -0.0384901 0.0450181 -0.0007546 -0.0063306
text756 -0.0201793 0.0125803 0.0005105 -0.0026977
text757 -0.0247000 0.0219591 0.0042964 -0.0018364
text758 -0.0260638 0.0241676 0.0048360 0.0004432
text759 -0.0340141 0.0290120 -0.0078927 -0.0131849
text760 -0.0306821 0.0360198 -0.0002326 -0.0010770
text761 -0.0410060 0.0364852 -0.0135248 -0.0222282
text762 -0.0318848 0.0345422 -0.0001567 0.0002855
text763 -0.0108322 0.0100202 0.0000118 0.0004832
text764 -0.0231008 0.0012211 0.0146265 0.0045819
text765 -0.0272630 0.0069008 0.0198868 0.0065063
text766 -0.0205631 -0.0071754 0.0089083 -0.0041171
text767 -0.0270629 0.0098756 0.0107202 -0.0030073
text768 -0.0311228 0.0057800 0.0150993 -0.0020896
text769 -0.0336961 0.0150366 0.0095644 0.0009296
text770 -0.0064343 -0.0020907 0.0063269 0.0002337
text771 -0.0289891 0.0602131 -0.0024564 0.0158551
text772 -0.0266258 0.0366021 0.0018702 0.0098373
text773 -0.0352914 0.0628796 -0.0023353 0.0322223
text774 -0.0238592 0.0685545 -0.0029754 0.0382096
text775 -0.0291969 0.0644272 -0.0029621 0.0460548
text776 -0.0233778 0.0309343 -0.0016700 0.0095488
text777 -0.0228718 0.0142099 0.0136940 0.0112264
text778 -0.0356685 0.0058738 0.0170273 0.0008987
text779 -0.0144526 0.0007495 0.0121131 -0.0006222
text780 -0.0213932 0.0022029 0.0079014 -0.0005303
text781 -0.0022675 -0.0001407 0.0018768 0.0000027
text782 -0.0323464 0.0407789 0.0029929 0.0054801
text783 -0.0252892 0.0362123 0.0025821 0.0114609
text784 -0.0169707 0.0203712 0.0010805 0.0058664
text785 -0.0369689 0.0437560 0.0112681 0.0121757
text786 -0.0245449 0.0345591 0.0050451 0.0056654
text787 -0.0205265 0.0247393 -0.0024719 -0.0030624
text788 -0.0237019 0.0302302 -0.0005076 -0.0042072
text789 -0.0186857 0.0203644 0.0001090 -0.0025428
text790 -0.0240670 0.0150798 0.0075725 0.0007033
text791 -0.0268843 -0.0025208 0.0098521 -0.0030419
text792 -0.0167968 0.0015056 0.0068416 0.0008872
text793 -0.0190150 0.0085609 0.0078822 0.0086992
text794 -0.0201708 0.0193689 -0.0002546 0.0085273
text795 -0.0307276 0.0650878 -0.0020586 0.0142869
text796 -0.0145188 0.0244001 0.0023468 0.0117541
text797 -0.0295926 0.0598833 -0.0003212 0.0205994
text798 -0.0223948 0.0384099 -0.0015949 0.0153246
text799 -0.0204531 0.0264376 0.0038712 0.0064874
text800 -0.0109601 0.0017521 0.0037075 0.0040081
text801 -0.0152867 0.0110898 0.0019766 0.0067568
text802 -0.0130945 0.0048574 0.0000126 0.0039167
text803 -0.0303301 0.0362430 0.0055545 0.0119072
text804 -0.0346882 0.0377897 0.0023708 0.0028386
text805 -0.0347460 0.0336957 0.0042148 0.0065983
text806 -0.0318949 0.0420702 -0.0194040 -0.0063355
text807 -0.0119577 0.0104761 -0.0050665 -0.0044619
text808 -0.0213345 0.0414209 -0.0017419 0.0194532
text809 -0.0355143 0.0856837 -0.0028925 0.0260430
text810 -0.0318531 0.0572162 -0.0006690 0.0163769
text811 -0.0286625 0.0520201 -0.0058168 0.0136456
text812 -0.0332555 0.0115350 -0.0129684 -0.0288393
text813 -0.0460313 0.0130825 -0.0187616 -0.0504652
text814 -0.0380797 0.0328016 -0.0179468 -0.0190681
text815 -0.0163871 0.0067259 -0.0085099 -0.0116168
text816 -0.0209306 0.0063505 0.0002919 0.0039765
text817 -0.0153556 0.0043953 0.0043424 0.0061796
text818 -0.0174380 0.0297192 0.0028518 0.0112530
text819 -0.0099384 -0.0020902 0.0061286 0.0011073
text820 -0.0130774 0.0060792 0.0044857 0.0049566
text821 -0.0250987 0.0104957 0.0003294 0.0066760
text822 -0.0218886 0.0206634 -0.0050679 0.0073793
text823 -0.0120309 0.0061304 0.0022196 0.0028851
text824 -0.0094658 0.0008523 0.0009146 0.0019154
text825 -0.0015952 0.0049794 -0.0002685 0.0010653
text826 -0.0448826 0.0921015 -0.0154622 0.0235030
text827 -0.0451989 0.0832044 -0.0064745 0.0293726
text828 -0.0443781 0.0381317 -0.0191933 0.0036721
text829 -0.0380980 0.0442711 -0.0075811 -0.0011929
text830 -0.0147009 0.0097894 -0.0002952 -0.0032324
text831 -0.0351014 0.0147570 0.0104433 -0.0026567
text832 -0.0443763 0.0127821 0.0223146 0.0079202
text833 -0.0400392 0.0345471 -0.0114330 -0.0134345
text834 -0.0448987 0.0579230 -0.0127463 -0.0154831
text835 -0.0464682 0.0367860 -0.0044169 -0.0090635
text836 -0.0154721 0.0067464 0.0026355 -0.0022947
text837 -0.0112470 0.0142600 0.0001295 0.0038449
text838 -0.0282485 0.0267754 -0.0074074 0.0027968
text839 -0.0222009 0.0219108 -0.0090204 0.0007667
text840 -0.0186223 0.0167244 -0.0081306 -0.0045538
text841 -0.0259896 0.0394396 -0.0083757 -0.0045524
text842 -0.0213902 0.0247613 -0.0047969 -0.0036327
text843 -0.0282086 0.0374140 -0.0057775 -0.0057913
text844 -0.0431235 0.0405747 0.0173430 0.0026627
text845 -0.0482202 0.0464304 0.0149178 -0.0061781
text846 -0.0069274 0.0040227 0.0005559 -0.0015400
text847 -0.0329974 0.0251860 -0.0007935 -0.0109707
text848 -0.0402267 0.0616039 -0.0043738 -0.0061732
text849 -0.0465591 0.0892869 -0.0085016 -0.0043306
text850 -0.0365779 0.0586351 -0.0060465 -0.0070547
text851 -0.0002838 -0.0000618 0.0002489 0.0000980
text852 -0.0456035 0.0241563 -0.0046082 -0.0075891
text853 -0.0360190 0.0114290 -0.0095300 -0.0141206
text854 -0.0465355 0.0347736 -0.0069596 -0.0072649
text855 -0.0152913 0.0035681 -0.0000454 0.0017518
text856 -0.0229131 0.0273527 -0.0067771 0.0009098
text857 -0.0387827 0.0676602 -0.0176577 0.0019284
text858 -0.0348759 0.0693582 -0.0251489 -0.0046311
text859 -0.0346929 0.0498598 -0.0126207 -0.0100523
text860 -0.0433119 0.0879290 -0.0083658 0.0058832
text861 -0.0242736 0.0372177 -0.0004110 0.0375299
text862 -0.0341259 0.0820039 -0.0051406 0.0254618
text863 -0.0445230 0.0753160 0.0013732 0.0185358
text864 -0.0025641 0.0009739 -0.0001450 0.0006092
text865 -0.0142168 0.0089706 0.0076364 0.0021647
text866 -0.0297039 0.0299150 0.0060302 0.0067023
text867 -0.0349369 0.0410755 0.0016136 -0.0011405
text868 -0.0231776 0.0628362 -0.0055439 0.0136257
text869 -0.0415899 0.1032760 -0.0090069 0.0224176
text870 -0.0339438 0.0674425 -0.0089763 0.0076707
text871 -0.0337707 0.0408385 0.0054640 0.0182210
text872 -0.0402399 0.0897367 -0.0085321 0.0224733
text873 -0.0320699 0.0597322 -0.0020572 0.0112293
text874 -0.0171929 0.0219193 0.0007448 0.0060973
text875 -0.0474566 0.0781342 -0.0059912 0.0100717
text876 -0.0331743 0.0567434 -0.0058854 0.0031016
text877 -0.0199596 0.0375070 -0.0036501 0.0007932
text878 -0.0528764 0.0888338 -0.0128526 -0.0081905
text879 -0.0063761 0.0066353 0.0022164 0.0005248
text880 -0.0356680 0.0221865 0.0162887 0.0038154
text881 -0.0337523 0.0040092 0.0171010 0.0024610
text882 -0.0317422 0.0239174 0.0090630 0.0029088
text883 -0.0090004 0.0067657 0.0011859 0.0005803
text884 -0.0198740 0.0294607 0.0015478 0.0003065
text885 -0.0235745 0.0140956 0.0049738 -0.0030015
text886 -0.0210413 0.0080306 0.0057263 -0.0023939
text887 -0.0181039 0.0187042 0.0031963 -0.0011140
text888 -0.0097447 0.0097773 0.0008938 0.0013136
text889 -0.0383505 0.0555825 -0.0061476 -0.0050629
text890 -0.0510995 0.0372241 -0.0168786 -0.0291522
text891 -0.0531184 0.0984009 -0.0132113 -0.0055529
text892 -0.0014824 0.0020201 0.0001771 0.0014040
text893 -0.0220195 0.0329307 -0.0037088 -0.0002855
text894 -0.0543344 0.0842363 -0.0323526 0.0032761
text895 -0.0288419 0.0317985 -0.0159975 -0.0094430
text896 -0.0191795 0.0272518 -0.0095959 -0.0046681
text897 -0.0282684 0.0433301 -0.0107100 -0.0100616
text898 -0.0052711 0.0018991 -0.0020203 -0.0039901
text899 -0.0202102 0.0176327 -0.0001023 0.0043841
text900 -0.0198055 0.0180507 -0.0006327 0.0063320
text901 -0.0190382 -0.0027098 -0.0008562 0.0032137
text902 -0.0149755 0.0088588 0.0007665 0.0048807
text903 -0.0257757 0.0305526 0.0016962 0.0053075
text904 -0.0080021 0.0068835 0.0010278 0.0006646
text905 -0.0250172 0.0246492 -0.0034599 -0.0007031
text906 -0.0315075 0.0342969 -0.0123212 -0.0015932
text907 -0.0227468 0.0394497 -0.0064555 0.0105014
text908 -0.0325728 0.0616267 -0.0078691 0.0116351
text909 -0.0243424 0.0372922 -0.0057637 0.0146793
text910 -0.0103048 0.0099929 0.0003829 0.0009863
text911 -0.0364169 0.1095102 -0.0114897 0.0081074
text912 -0.0451471 0.1048550 -0.0176215 -0.0071164
text913 -0.0489680 0.1233122 -0.0259546 -0.0099902
text914 -0.0296118 0.0652220 -0.0040458 0.0373704
text915 -0.0345062 0.0940712 -0.0024001 0.0577295
text916 -0.0277423 0.0585962 0.0066054 0.0332074
text917 -0.0463057 0.0757675 0.0075516 0.0227871
text918 -0.0270947 0.0317818 0.0038849 0.0078837
text919 -0.0201617 0.0241444 0.0010432 0.0016828
text920 -0.0330667 0.0579993 -0.0130458 -0.0148843
text921 -0.0259331 0.0189877 0.0034169 -0.0070681
text922 -0.0240337 0.0259808 0.0143502 0.0055267
text923 -0.0242736 0.0372177 -0.0004110 0.0375299
text924 -0.0341259 0.0820039 -0.0051406 0.0254618
text925 -0.0445230 0.0753160 0.0013732 0.0185358
text926 -0.0025641 0.0009739 -0.0001450 0.0006092
text927 -0.0463057 0.0757675 0.0075516 0.0227871
text928 -0.0270947 0.0317818 0.0038849 0.0078837
text929 -0.0201617 0.0241444 0.0010432 0.0016828
text930 -0.0330667 0.0579993 -0.0130458 -0.0148843
text931 -0.0259331 0.0189877 0.0034169 -0.0070681
text932 -0.0240337 0.0259808 0.0143502 0.0055267
text933 -0.0271102 0.0663286 -0.0046526 0.0116985
text934 -0.0528747 0.1117912 -0.0113337 0.0096725
text935 -0.0093253 0.0181677 -0.0035374 0.0020244
text936 -0.0452250 0.0890385 -0.0220151 0.0045936
text937 -0.0317083 0.0635611 -0.0139746 -0.0009301
text938 -0.0364169 0.1095102 -0.0114897 0.0081074
text939 -0.0451471 0.1048550 -0.0176215 -0.0071164
text940 -0.0489680 0.1233122 -0.0259546 -0.0099902
text941 -0.0296118 0.0652220 -0.0040458 0.0373704
text942 -0.0345062 0.0940712 -0.0024001 0.0577295
text943 -0.0277423 0.0585962 0.0066054 0.0332074
text944 -0.0198740 0.0294607 0.0015478 0.0003065
text945 -0.0235745 0.0140956 0.0049738 -0.0030015
text946 -0.0210413 0.0080306 0.0057263 -0.0023939
text947 -0.0181039 0.0187042 0.0031963 -0.0011140
text948 -0.0097447 0.0097773 0.0008938 0.0013136
text949 -0.0321612 0.0212443 -0.0028927 0.0025856
text950 -0.0220232 0.0019759 -0.0084276 -0.0006875
text951 -0.0329266 0.0180397 0.0044509 0.0162967
text952 -0.0372411 0.0641975 -0.0164971 0.0083752
text953 -0.0371450 0.0806741 -0.0082700 0.0168961
text954 -0.0302795 0.0467986 0.0006462 0.0103485
text955 -0.0338505 0.0331593 -0.0014993 0.0091597
text956 -0.0090037 0.0045713 -0.0005206 0.0025432
text957 -0.0425550 0.0860514 0.0016265 0.1127850
text958 -0.0246766 0.0391376 -0.0027759 0.0396524
text959 -0.0173665 0.0058163 0.0070624 0.0049658
text960 -0.0248105 0.0035831 0.0052525 0.0018951
text961 -0.0366646 0.0183849 0.0022820 -0.0032275
text962 -0.0294316 0.0329813 0.0007769 0.0010161
text963 -0.0138445 0.0002658 -0.0012837 -0.0072304
text964 -0.0408020 0.0262022 -0.0270340 0.0416703
text965 -0.0241383 0.0233672 -0.0148032 0.0214015
text966 -0.0341173 0.0203013 0.0010166 0.0288348
text967 -0.0148961 0.0205519 -0.0007146 0.0158165
text968 -0.0268821 0.0270280 0.0127560 0.0444791
text969 -0.0252640 0.0337439 0.0139745 0.0510778
text970 -0.0249958 0.0310170 0.0110040 0.0359319
text971 -0.0470272 -0.0326545 0.1208566 0.0351302
text972 -0.0125381 -0.0085591 0.0302913 0.0038915
text973 -0.0154731 -0.0084471 0.0286613 0.0070375
text974 -0.0173360 0.0020957 0.0214444 0.0102382
text975 -0.0056131 0.0016355 0.0053966 0.0020100
text976 -0.0222003 -0.0175115 0.0060163 -0.0009680
text977 -0.0138098 -0.0122621 0.0032728 -0.0006330
text978 -0.0178510 -0.0148211 0.0041498 -0.0030059
text979 -0.0115464 -0.0083463 0.0004459 -0.0028709
text980 -0.0232325 -0.0168376 0.0059982 -0.0025427
text981 -0.0210325 -0.0152036 0.0032270 0.0012958
text982 -0.0285603 -0.0179626 0.0046903 -0.0044626
text983 -0.0220295 -0.0154417 0.0026191 -0.0040239
text984 -0.0256285 -0.0235578 -0.0010661 0.0026046
text985 -0.0255021 -0.0174213 0.0197083 -0.0001143
text986 -0.0209018 -0.0130389 0.0092151 -0.0030083
text987 -0.0147851 -0.0081150 0.0104972 -0.0003340
text988 -0.0275899 -0.0180239 0.0399046 0.0036791
text989 -0.0348991 -0.0182770 0.0219337 0.0072466
text990 -0.0372703 -0.0146489 0.0330519 -0.0004354
text991 -0.0371351 -0.0167497 0.0665862 0.0152198
text992 -0.0354979 -0.0191796 0.0248302 0.0000627
text993 -0.0243754 -0.0015298 0.0257585 0.0043978
text994 -0.0012240 -0.0003165 0.0010725 -0.0003083
text995 -0.0349054 -0.0130201 0.0155796 -0.0066277
text996 -0.0221590 -0.0074224 0.0168541 -0.0023183
text997 -0.0199178 -0.0149241 0.0510025 0.0112930
text998 -0.0207668 -0.0188404 0.0484058 0.0084296
text999 -0.0250540 -0.0185392 0.0384077 0.0092102
text1000 -0.0155849 -0.0124551 0.0322648 0.0077263
text1001 -0.0091374 -0.0058421 0.0166754 0.0026362
text1002 -0.0241932 -0.0089632 0.0289038 0.0055310
text1003 -0.0280282 -0.0191892 0.0519701 0.0120413
text1004 -0.0204261 -0.0117114 0.0351232 0.0071226
text1005 -0.0169739 -0.0084673 0.0265157 0.0052482
text1006 -0.0084076 -0.0048086 0.0066705 0.0001065
text1007 -0.0144581 -0.0107126 0.0143189 0.0070735
text1008 -0.0284418 -0.0132638 0.0401672 0.0112455
text1009 -0.0297868 -0.0126024 0.0388401 0.0072836
text1010 -0.0254401 -0.0055932 0.0237178 0.0064030
text1011 -0.0284401 -0.0048377 0.0226645 -0.0022219
text1012 -0.0246866 -0.0142298 0.0233254 -0.0028564
text1013 -0.0208002 -0.0119017 0.0233529 0.0030591
text1014 -0.0131403 -0.0085198 0.0189615 0.0039824
text1015 -0.0210748 -0.0187154 0.0401971 0.0141927
text1016 -0.0239772 -0.0118048 0.0337362 0.0024035
text1017 -0.0226982 -0.0114860 0.0383857 0.0049245
text1018 -0.0071220 -0.0053205 0.0161619 0.0027188
text1019 -0.0145617 -0.0090597 0.0167397 0.0061996
text1020 -0.0114581 -0.0054636 0.0091508 -0.0015005
text1021 -0.0130699 -0.0028457 0.0100625 0.0002419
text1022 -0.0162413 -0.0020402 0.0126287 0.0040269
text1023 -0.0220275 -0.0036692 0.0221310 0.0038919
text1024 -0.0293900 -0.0201911 0.0387718 0.0049212
text1025 -0.0140034 -0.0099257 0.0192371 0.0039863
text1026 -0.0178090 -0.0117884 0.0168590 0.0016373
text1027 -0.0284837 -0.0177239 0.0345744 0.0012297
text1028 -0.0290565 -0.0094444 0.0442804 0.0079888
text1029 -0.0198982 -0.0106255 0.0295156 0.0028312
text1030 -0.0410261 -0.0303065 0.0807453 0.0145319
text1031 -0.0265166 -0.0238923 0.0623198 0.0085341
text1032 -0.0128090 -0.0054520 0.0203868 0.0033384
text1033 -0.0160658 -0.0110982 0.0235756 0.0055639
text1034 -0.0246089 -0.0188022 0.0267044 0.0055690
text1035 -0.0172468 -0.0143210 0.0312954 0.0057285
text1036 -0.0176813 -0.0110164 0.0158770 0.0028567
text1037 -0.0194000 -0.0127450 0.0206139 0.0077333
text1038 -0.0061657 -0.0026137 0.0018255 0.0040065
text1039 -0.0092464 -0.0046801 0.0048941 0.0029988
text1040 -0.0179439 -0.0110591 0.0177932 0.0038900
text1041 -0.0139677 -0.0121509 0.0192505 0.0035578
text1042 -0.0192171 -0.0100091 0.0200742 0.0050517
text1043 -0.0171263 -0.0140020 0.0299651 0.0065148
text1044 -0.0010685 -0.0008475 0.0007658 0.0010281
text1045 -0.0151357 -0.0082455 0.0150408 0.0014466
text1046 -0.0200334 -0.0049669 0.0170416 0.0019654
text1047 -0.0253160 -0.0113686 0.0185021 0.0002079
text1048 -0.0251418 -0.0161469 0.0263935 0.0027670
text1049 -0.0179139 -0.0094574 0.0250832 0.0091655
text1050 -0.0099410 -0.0036277 0.0117922 0.0030250
text1051 -0.0419064 -0.0323624 0.0538877 0.0078780
text1052 -0.0203400 -0.0085750 0.0209533 0.0041619
text1053 -0.0209042 -0.0106663 0.0188059 0.0041184
text1054 -0.0199454 -0.0118466 0.0174727 -0.0017813
text1055 -0.0276068 -0.0104652 0.0250484 0.0029041
text1056 -0.0170103 -0.0075140 0.0180475 0.0030910
text1057 -0.0010483 -0.0005873 0.0017255 0.0002276
text1058 -0.0185959 -0.0097532 0.0158973 -0.0001826
text1059 -0.0133493 0.0016092 0.0079719 0.0005306
text1060 -0.0234403 -0.0076870 0.0420423 0.0069352
text1061 -0.0171340 -0.0052601 0.0128770 0.0005937
text1062 -0.0139176 -0.0018282 0.0037099 -0.0024943
text1063 -0.0117459 -0.0051565 0.0109801 0.0032857
text1064 -0.0142069 -0.0064186 0.0132818 0.0021816
text1065 -0.0135568 -0.0079545 0.0123459 0.0015502
text1066 -0.0272307 -0.0197886 0.0530494 0.0075798
text1067 -0.0377963 -0.0370024 0.1075597 0.0183927
text1068 -0.0291618 -0.0309225 0.0992795 0.0193414
text1069 -0.0150939 -0.0129664 0.0372990 0.0088475
text1070 -0.0179114 -0.0114680 0.0106272 0.0025168
text1071 -0.0136222 -0.0100518 0.0198259 0.0035678
text1072 -0.0216074 -0.0131234 0.0155019 0.0065339
text1073 -0.0212570 -0.0137254 0.0235694 0.0043042
text1074 -0.0040555 -0.0021911 0.0052173 0.0012485
text1075 -0.0214421 -0.0019089 0.0211170 0.0125574
text1076 -0.0197200 -0.0018771 0.0226796 0.0107435
text1077 -0.0151471 -0.0094570 0.0270332 0.0066306
text1078 -0.0207795 -0.0135142 0.0353057 0.0062469
text1079 -0.0113448 -0.0071943 0.0167652 0.0021027
text1080 -0.0253072 -0.0145377 0.0408046 0.0098330
text1081 -0.0236678 -0.0192688 0.0501994 0.0116057
text1082 -0.0199404 -0.0253949 0.0925495 0.0207475
text1083 -0.0151471 -0.0094570 0.0270332 0.0066306
text1084 -0.0207795 -0.0135142 0.0353057 0.0062469
text1085 -0.0113448 -0.0071943 0.0167652 0.0021027
text1086 -0.0253072 -0.0145377 0.0408046 0.0098330
text1087 -0.0236678 -0.0192688 0.0501994 0.0116057
text1088 -0.0199404 -0.0253949 0.0925495 0.0207475
text1089 -0.0094703 -0.0102072 0.0205966 0.0054563
text1090 -0.0285917 -0.0111176 0.0301732 0.0010597
text1091 -0.0201503 -0.0165776 0.0445771 0.0083253
text1092 -0.0229321 -0.0145344 0.0300328 0.0078221
text1093 -0.0202094 -0.0145203 0.0355966 0.0065661
text1094 -0.0242018 -0.0131876 0.0254351 0.0035450
text1095 -0.0067322 -0.0068213 0.0090367 0.0015180
text1096 -0.0173636 -0.0129301 0.0214460 0.0042458
text1097 -0.0320231 -0.0235083 0.0589731 0.0121324
text1098 -0.0237467 -0.0213508 0.0395128 0.0036416
text1099 -0.0169007 -0.0123195 0.0220186 0.0059627
text1100 -0.0126996 -0.0046020 0.0148387 0.0031897
text1101 -0.0122505 -0.0099574 0.0282459 0.0077736
text1102 -0.0162773 -0.0132139 0.0360415 0.0075176
text1103 -0.0214522 -0.0292719 0.0415300 0.0063516
text1104 -0.0319549 -0.0237327 0.0412914 -0.0012595
text1105 -0.0245967 -0.0205498 0.0286367 0.0061306
text1106 -0.0304542 -0.0245625 0.0327032 -0.0067697
text1107 -0.0247435 -0.0247196 0.0311028 -0.0032040
text1108 -0.0257948 -0.0222642 0.0305271 0.0013257
text1109 -0.0262285 -0.0215731 0.0302932 0.0009773
text1110 -0.0252521 -0.0264942 0.0379437 0.0049646
text1111 -0.0192676 -0.0136515 0.0281071 0.0009447
text1112 -0.0347167 -0.0342337 0.0434928 0.0004910
text1113 -0.0244701 -0.0238497 0.0327413 -0.0063515
text1114 -0.0269921 -0.0238286 0.0322015 -0.0073444
text1115 -0.0377411 -0.0333112 0.0629164 0.0001559
text1116 -0.0368683 -0.0445966 0.0476916 0.0210657
text1117 -0.0259677 -0.0241687 0.0219585 0.0026820
text1118 -0.0301166 -0.0146272 0.0381856 0.0052224
text1119 -0.0162222 -0.0070305 0.0192267 0.0030285
text1120 -0.0145506 -0.0043810 0.0069270 -0.0007846
text1121 -0.0174923 -0.0083459 0.0042399 -0.0041995
text1122 -0.0184340 -0.0073860 0.0099225 -0.0045545
text1123 -0.0296207 -0.0162231 0.0676966 0.0100054
text1124 -0.0305605 -0.0064434 0.0522023 0.0066935
text1125 -0.0274595 -0.0109308 0.0440128 0.0060410
text1126 -0.0296254 -0.0154951 0.0416525 0.0058692
text1127 -0.0287999 -0.0160668 0.0426565 0.0016893
text1128 -0.0324509 -0.0200455 0.0714204 0.0054908
text1129 -0.0158333 -0.0077993 0.0231444 0.0040821
text1130 -0.0166545 -0.0011147 0.0105405 0.0087832
text1131 -0.0270932 0.0220911 0.0010504 0.0151892
text1132 -0.0161242 0.0165804 -0.0000142 0.0060016
text1133 -0.0225085 -0.0158844 0.0215858 -0.0027357
text1134 -0.0236146 -0.0161793 0.0167269 -0.0038898
text1135 -0.0227301 -0.0098292 0.0174373 -0.0051548
text1136 -0.0227199 -0.0126223 0.0224201 0.0015993
text1137 -0.0192956 -0.0096321 0.0170796 0.0006755
text1138 -0.0085842 -0.0039003 0.0078169 0.0004159
text1139 -0.0274357 -0.0062567 0.0176944 -0.0012854
text1140 -0.0243711 -0.0047832 0.0207549 -0.0017759
text1141 -0.0238983 -0.0069998 0.0142620 -0.0031334
text1142 -0.0174057 -0.0088408 0.0125955 0.0023863
text1143 -0.0242226 -0.0130990 0.0140379 -0.0032863
text1144 -0.0285038 -0.0043321 0.0121895 -0.0028313
text1145 -0.0064356 -0.0040815 0.0043936 0.0025266
text1146 -0.0262610 -0.0187197 0.0291913 -0.0009728
text1147 -0.0167186 -0.0064467 0.0112405 -0.0007340
text1148 -0.0207037 -0.0125794 0.0228703 0.0029014
text1149 -0.0129677 -0.0072709 0.0067545 0.0023751
text1150 -0.0187349 -0.0091253 0.0080242 -0.0066221
text1151 -0.0124255 -0.0057169 0.0065132 -0.0007442
text1152 -0.0201124 -0.0097322 0.0070874 0.0008443
text1153 -0.0196531 -0.0054270 0.0108279 0.0007538
text1154 -0.0179157 -0.0102538 0.0067125 -0.0059407
text1155 -0.0113366 -0.0059591 0.0069865 -0.0042080
text1156 -0.0062321 -0.0020855 0.0018303 -0.0020181
text1157 -0.0235368 -0.0241941 0.0584593 0.0124386
text1158 -0.0244855 -0.0106466 0.0221217 0.0053385
text1159 -0.0131993 -0.0064533 0.0062377 0.0015792
text1160 -0.0122562 -0.0080524 0.0225224 0.0060705
text1161 -0.0049293 -0.0027934 0.0094813 0.0021753
text1162 -0.0211644 -0.0125356 0.0275538 0.0085401
text1163 -0.0190492 -0.0107053 0.0194465 0.0058457
text1164 -0.0225154 -0.0145326 0.0288332 0.0076590
text1165 -0.0171225 -0.0116729 0.0278252 0.0040452
text1166 -0.0244733 -0.0117538 0.0265764 0.0098712
text1167 -0.0117007 -0.0031526 0.0143305 0.0045570
text1168 -0.0186590 -0.0083812 0.0116189 -0.0017007
text1169 -0.0238350 -0.0159615 0.0212070 0.0031323
text1170 -0.0125544 -0.0030595 0.0098875 0.0021161
text1171 -0.0252072 -0.0121408 0.0222294 -0.0036411
text1172 -0.0189061 -0.0102335 0.0133336 -0.0010737
text1173 -0.0079982 -0.0044934 0.0070771 0.0009794
text1174 -0.0186590 -0.0083812 0.0116189 -0.0017007
text1175 -0.0238350 -0.0159615 0.0212070 0.0031323
text1176 -0.0125544 -0.0030595 0.0098875 0.0021161
text1177 -0.0252072 -0.0121408 0.0222294 -0.0036411
text1178 -0.0189061 -0.0102335 0.0133336 -0.0010737
text1179 -0.0079982 -0.0044934 0.0070771 0.0009794
text1180 -0.0238085 -0.0121498 0.0100108 0.0051301
text1181 -0.0191390 -0.0114405 0.0192273 0.0013722
text1182 -0.0191605 -0.0130434 0.0106901 0.0023199
text1183 -0.0120912 -0.0079547 0.0070569 0.0010177
text1184 -0.0215433 -0.0129933 0.0162611 0.0040735
text1185 -0.0190242 -0.0081960 0.0186388 0.0028375
text1186 -0.0099459 -0.0047660 0.0080351 0.0015743
text1187 -0.0281091 -0.0187626 0.0282544 0.0051531
text1188 -0.0149369 -0.0079684 0.0112335 0.0032388
text1189 -0.0281091 -0.0187626 0.0282544 0.0051531
text1190 -0.0149369 -0.0079684 0.0112335 0.0032388
text1191 -0.0231752 -0.0194658 0.0656339 0.0108038
text1192 -0.0227851 -0.0136126 0.0451410 0.0034113
text1193 -0.0198902 -0.0151360 0.0360868 0.0067546
text1194 -0.0228518 -0.0143274 0.0424977 0.0054094
text1195 -0.0195070 -0.0174073 0.0392162 0.0043063
text1196 -0.0218024 -0.0146727 0.0400129 0.0066365
text1197 -0.0158582 -0.0147543 0.0377303 0.0030246
text1198 -0.0232767 -0.0152794 0.0389025 0.0046568
text1199 -0.0535793 -0.0365869 0.0983513 0.0174463
text1200 -0.0253214 -0.0127435 0.0333462 0.0033570
text1201 -0.0396566 -0.0300656 0.0737924 0.0151038
text1202 -0.0098603 -0.0060615 0.0162700 0.0029353
text1203 -0.0242161 -0.0247548 0.0577583 0.0068936
text1204 -0.0144860 -0.0100654 0.0240335 0.0032534
text1205 -0.0129762 -0.0059343 0.0117907 -0.0007517
text1206 -0.0113764 -0.0063911 0.0146978 0.0040480
text1207 -0.0102598 -0.0059454 0.0132757 0.0054319
text1208 -0.0173313 -0.0127241 0.0273363 0.0052954
text1209 -0.0133546 -0.0115297 0.0241664 0.0050609
text1210 -0.0124874 -0.0099186 0.0171910 0.0012146
text1211 -0.0110944 -0.0064187 0.0090577 0.0018846
text1212 -0.0040175 -0.0018344 0.0023126 -0.0002280
text1213 -0.0185136 -0.0030719 0.0264387 0.0148726
text1214 -0.0159681 -0.0057147 0.0195049 0.0110946
text1215 -0.0132846 -0.0078471 0.0112509 0.0070811
text1216 -0.0259760 -0.0141900 0.0243252 0.0053645
text1217 -0.0223435 -0.0125704 0.0208375 0.0016063
text1218 -0.0235111 -0.0133797 0.0268141 0.0079958
text1219 -0.0256192 -0.0080074 0.0307764 0.0099518
text1220 -0.0354537 -0.0196347 0.0526016 0.0044227
text1221 -0.0076849 -0.0044281 0.0065313 0.0026662
text1222 -0.0331396 -0.0191415 0.0480987 -0.0022103
text1223 -0.0151661 -0.0119982 0.0226967 -0.0031261
text1224 -0.0267368 -0.0213433 0.0396613 0.0074942
text1225 -0.0211184 -0.0108045 0.0263006 0.0066452
text1226 -0.0126821 -0.0097052 0.0191086 0.0068496
text1227 -0.0185914 -0.0167589 0.0322808 0.0072619
text1228 -0.0316281 -0.0239423 0.0446281 0.0112825
text1229 -0.0201399 -0.0165777 0.0390141 0.0103380
text1230 -0.0184581 -0.0108156 0.0234617 0.0076415
text1231 -0.0237889 -0.0157002 0.0282077 0.0117356
text1232 -0.0097602 -0.0092773 0.0174648 0.0051544
text1233 -0.0386951 -0.0187885 0.0905965 0.0210118
text1234 -0.0445560 -0.0346257 0.1287691 0.0256585
text1235 -0.0450785 -0.0316707 0.1044005 0.0273610
text1236 -0.0248358 -0.0102612 0.0404128 0.0062831
text1237 -0.0239457 -0.0109814 0.0187056 -0.0008420
text1238 -0.0241894 -0.0135012 0.0230123 0.0038015
text1239 -0.0056791 0.0004618 0.0034132 0.0012086
text1240 -0.0203604 -0.0127864 0.0261125 0.0077190
text1241 -0.0172091 -0.0058820 0.0180481 0.0003501
text1242 -0.0221294 -0.0080191 0.0197600 -0.0018498
text1243 -0.0148584 -0.0117424 0.0197293 0.0010147
text1244 -0.0155019 -0.0087822 0.0227989 0.0025723
text1245 -0.0031622 -0.0023090 0.0035140 0.0010601
text1246 -0.0169609 -0.0072296 0.0121245 0.0010414
text1247 -0.0152729 -0.0121264 0.0113229 0.0046816
text1248 -0.0235211 -0.0119081 0.0203465 -0.0015821
text1249 -0.0107376 -0.0043747 0.0084898 -0.0001421
text1250 -0.0202073 -0.0117020 0.0171877 0.0000047
text1251 -0.0161262 -0.0066680 0.0133748 -0.0015715
text1252 -0.0126374 -0.0109898 0.0169301 0.0022939
text1253 -0.0094042 -0.0062960 0.0063829 0.0012226
text1254 -0.0097084 -0.0055958 0.0051556 -0.0019282
text1255 -0.0121109 -0.0025475 0.0089829 -0.0010848
text1256 -0.0168702 -0.0075219 0.0238480 0.0064874
text1257 -0.0238096 -0.0161113 0.0331021 0.0016966
text1258 -0.0238813 -0.0115240 0.0260739 0.0015260
text1259 -0.0180455 -0.0110059 0.0193329 0.0028770
text1260 -0.0137165 -0.0092296 0.0142035 0.0010840
text1261 -0.0162348 -0.0105286 0.0211166 0.0001892
text1262 -0.0001354 -0.0002113 0.0005279 -0.0000709
text1263 -0.0213465 -0.0051867 0.0197088 0.0001245
text1264 -0.0184797 0.0006751 0.0139339 0.0029983
text1265 -0.0139337 -0.0078511 0.0118389 0.0029084
text1266 -0.0145363 0.0023955 0.0126649 0.0067576
text1267 -0.0231630 -0.0084126 0.0196143 0.0042069
text1268 -0.0186263 -0.0033988 0.0079080 -0.0013910
text1269 -0.0196692 -0.0003400 0.0089264 0.0017387
text1270 -0.0298738 -0.0060255 0.0145928 -0.0028674
text1271 -0.0255825 -0.0069126 0.0162168 0.0056273
text1272 -0.0183746 -0.0072260 0.0118741 0.0022930
text1273 -0.0224918 -0.0058505 0.0161267 0.0084872
text1274 -0.0187939 -0.0123916 0.0119268 -0.0000274
text1275 -0.0250872 -0.0154157 0.0147803 -0.0003747
text1276 -0.0084241 -0.0042067 0.0048075 0.0006028
text1277 -0.0266410 -0.0263991 0.0700499 0.0159152
text1278 -0.0235953 -0.0185272 0.0405770 0.0094306
text1279 -0.0353413 -0.0343207 0.0782522 0.0191723
text1280 -0.0243772 -0.0216084 0.0527765 0.0088372
text1281 -0.0218127 -0.0106865 0.0212158 -0.0005694
text1282 -0.0178868 -0.0063653 0.0170606 0.0000369
text1283 -0.0181415 -0.0060938 0.0161015 -0.0000281
text1284 -0.0196875 -0.0113688 0.0269542 0.0042661
text1285 -0.0224882 -0.0093405 0.0223077 0.0025204
text1286 -0.0123197 -0.0053514 0.0147080 0.0031464
text1287 -0.0295005 -0.0029757 0.0453603 0.0131348
text1288 -0.0228497 -0.0118634 0.0299708 0.0060066
text1289 -0.0239658 -0.0097514 0.0337060 0.0116057
text1290 -0.0167061 0.0004961 0.0088835 0.0050500
text1291 -0.0194456 -0.0114314 0.0230972 0.0059780
text1292 -0.0199051 -0.0065633 0.0276749 0.0065087
text1293 -0.0164042 -0.0089796 0.0250220 0.0043664
text1294 -0.0292774 -0.0259460 0.0636581 0.0151690
text1295 -0.0214477 -0.0112708 0.0429149 0.0117623
text1296 -0.0310994 -0.0175278 0.0720504 0.0157519
text1297 -0.0328056 -0.0313461 0.0822017 0.0179847
text1298 -0.0345539 -0.0252388 0.0733985 0.0120810
text1299 -0.0257661 -0.0198282 0.0459608 0.0083313
text1300 -0.0147166 -0.0090563 0.0233870 0.0062795
text1301 -0.0263456 -0.0156293 0.0255102 0.0005722
text1302 -0.0331501 -0.0167364 0.0434939 0.0041830
text1303 -0.0276590 -0.0128778 0.0334430 0.0068494
text1304 -0.0249355 -0.0151158 0.0301837 0.0039432
text1305 -0.0203486 -0.0148969 0.0205627 -0.0003576
text1306 -0.0199036 -0.0074581 0.0176603 0.0003516
text1307 -0.0141087 -0.0034968 0.0107500 0.0018270
text1308 -0.0110963 -0.0047028 0.0044790 0.0017730
text1309 -0.0148529 -0.0022568 0.0078490 0.0036055
text1310 -0.0163102 -0.0072631 0.0080936 0.0012271
text1311 -0.0190380 -0.0121390 0.0124352 -0.0004337
text1312 -0.0148717 -0.0060666 0.0131142 0.0016983
text1313 -0.0146381 -0.0066988 0.0142692 0.0050205
text1314 -0.0164621 -0.0049447 0.0100225 0.0025445
text1315 -0.0027845 -0.0011945 0.0010697 -0.0001716
text1316 -0.0220135 -0.0101771 0.0200871 0.0054067
text1317 -0.0191518 -0.0149883 0.0391383 0.0081219
text1318 -0.0125505 -0.0086241 0.0191920 0.0031596
text1319 -0.0158330 -0.0147393 0.0266364 0.0046906
text1320 -0.0162192 -0.0103692 0.0177857 0.0029235
text1321 -0.0023085 -0.0005858 0.0022416 0.0010583
text1322 -0.0285936 -0.0179461 0.0333453 0.0092403
text1323 -0.0334368 -0.0152622 0.0343416 0.0107335
text1324 -0.0325993 -0.0128034 0.0211156 -0.0007863
text1325 -0.0150414 -0.0048147 0.0114505 0.0037322
text1326 -0.0300606 -0.0097281 0.0596521 0.0098124
text1327 -0.0094376 -0.0008894 0.0117267 0.0007032
text1328 -0.0154628 -0.0104481 0.0215812 0.0058727
text1329 -0.0171304 -0.0075659 0.0115213 0.0060447
text1330 -0.0222916 -0.0071053 0.0118147 0.0058899
text1331 -0.0236364 -0.0145963 0.0350382 0.0061551
text1332 -0.0288623 -0.0229157 0.0721740 0.0151759
text1333 -0.0258637 -0.0138873 0.0433895 0.0110590
text1334 -0.0023720 -0.0012391 0.0008529 -0.0005517
text1335 -0.0156818 -0.0023940 0.0061661 0.0004349
text1336 -0.0134877 0.0049014 0.0054709 0.0066651
text1337 -0.0214546 -0.0074741 0.0090005 0.0100370
text1338 -0.0203300 -0.0108516 0.0054916 0.0151831
text1339 -0.0046561 0.0002105 0.0034609 0.0007325
text1340 -0.0208681 -0.0131218 0.0194035 0.0046350
text1341 -0.0099517 -0.0011642 0.0078330 0.0030831
text1342 -0.0274237 -0.0111208 0.0221667 0.0096249
text1343 -0.0104811 0.0034198 0.0047793 -0.0002065
text1344 -0.0183651 -0.0108382 0.0050208 0.0001450
text1345 -0.0168909 -0.0094141 0.0091423 0.0047790
text1346 -0.0101880 -0.0023519 0.0062977 -0.0001711
text1347 -0.0060476 -0.0025775 0.0003926 -0.0025569
text1348 -0.0199178 -0.0149241 0.0510025 0.0112930
text1349 -0.0207668 -0.0188404 0.0484058 0.0084296
text1350 -0.0250540 -0.0185392 0.0384077 0.0092102
text1351 -0.0155849 -0.0124551 0.0322648 0.0077263
text1352 -0.0091374 -0.0058421 0.0166754 0.0026362
text1353 -0.0262845 -0.0202025 0.0626862 0.0147606
text1354 -0.0121905 -0.0077810 0.0167179 0.0065283
text1355 -0.0175985 -0.0129004 0.0263908 0.0048875
text1356 -0.0181389 -0.0165556 0.0477068 0.0085469
text1357 -0.0180117 -0.0201083 0.0517978 0.0115285
text1358 -0.0314476 -0.0075955 0.0414872 0.0076994
text1359 -0.0190992 -0.0092232 0.0322768 0.0035414
text1360 -0.0255813 -0.0054276 0.0355909 0.0059357
text1361 -0.0274213 -0.0008209 0.0365301 0.0055831
text1362 -0.0086179 -0.0044916 0.0140517 0.0032662
text1363 -0.0176532 -0.0071213 0.0115839 0.0030242
text1364 -0.0160607 -0.0076934 0.0121910 0.0022804
text1365 -0.0196449 -0.0077007 0.0148938 0.0033588
text1366 -0.0143716 -0.0087438 0.0191285 0.0056031
text1367 -0.0196434 -0.0094279 0.0222537 0.0034856
text1368 -0.0086982 -0.0048728 0.0151825 0.0027900
text1369 -0.0045530 -0.0027982 0.0062111 0.0006941
text1370 -0.0219227 -0.0196646 0.0472492 0.0074813
text1371 -0.0208764 -0.0124188 0.0201754 0.0009754
text1372 -0.0045387 -0.0036443 0.0089064 0.0005189
text1373 -0.0279274 -0.0133612 0.0196186 0.0026106
text1374 -0.0185426 -0.0114527 0.0168776 0.0020348
text1375 -0.0186296 -0.0098700 0.0178121 0.0036093
text1376 -0.0256559 -0.0123573 0.0250706 0.0053335
text1377 -0.0341733 -0.0145620 0.0323973 0.0095320
text1378 -0.0013368 -0.0000752 0.0007296 -0.0003470
text1379 -0.0275008 -0.0186571 0.0350006 0.0157030
text1380 -0.0227336 -0.0122630 0.0312561 0.0074033
text1381 -0.0271434 -0.0119445 0.0270816 0.0070745
text1382 -0.0289924 -0.0193119 0.0501115 0.0134813
text1383 -0.0147569 -0.0085744 0.0235488 0.0052632
text1384 -0.0174883 -0.0082715 0.0119206 0.0071318
text1385 -0.0172456 -0.0076378 0.0140487 0.0036910
text1386 -0.0186192 -0.0100199 0.0154112 0.0053159
text1387 -0.0207576 -0.0109796 0.0239896 0.0121479
text1388 -0.0265772 -0.0073118 0.0224329 0.0154615
text1389 -0.0137747 -0.0027725 0.0091263 0.0036001
text1390 -0.0092731 -0.0030433 0.0046728 0.0026771
text1391 -0.0320523 -0.0344569 0.1407007 0.0186013
text1392 -0.0185421 -0.0243537 0.0970342 0.0165547
text1393 -0.0193515 -0.0220953 0.0762141 0.0093244
text1394 -0.0242607 -0.0313627 0.1174021 0.0181020
text1395 -0.0199014 -0.0195033 0.0611362 0.0101504
text1396 -0.0263855 -0.0275125 0.0847913 0.0150223
text1397 -0.0152298 -0.0237583 0.0745102 0.0106841
text1398 -0.0146808 -0.0170921 0.0511883 0.0065677
text1399 -0.0107480 -0.0099173 0.0198269 0.0067846
text1400 -0.0168060 -0.0170032 0.0804844 0.0108055
text1401 -0.0080115 -0.0059210 0.0181123 0.0012585
text1402 -0.0172274 -0.0193586 0.0644531 0.0112668
text1403 -0.0075528 -0.0063734 0.0195733 0.0043290
text1404 -0.0241621 -0.0196769 0.0095680 0.0134103
text1405 -0.0192720 -0.0172307 0.0120371 0.0077019
text1406 -0.0201197 -0.0140707 0.0166962 0.0091761
text1407 -0.0256287 -0.0199207 0.0216180 0.0112929
text1408 -0.0198045 -0.0180369 0.0357508 0.0090264
text1409 -0.0214027 -0.0133113 0.0381165 0.0080108
text1410 -0.0099786 -0.0057152 0.0147671 0.0014337
text1411 -0.0388965 -0.0293422 0.0615207 0.0078692
text1412 -0.0165393 -0.0122713 0.0248021 0.0062065
text1413 -0.0253738 -0.0132514 0.0361920 0.0095223
text1414 -0.0209431 -0.0159577 0.0362313 0.0042816
text1415 -0.0236067 -0.0173798 0.0272316 0.0035329
text1416 -0.0368629 -0.0263240 -0.0513811 -0.0950757
text1417 -0.0259037 -0.0219388 -0.0167730 -0.0547199
text1418 -0.0368457 -0.0280660 -0.0328888 -0.0797909
text1419 -0.0361311 -0.0085427 -0.0430633 -0.0817451
text1420 -0.0071000 -0.0051373 -0.0085707 -0.0121793
text1421 -0.0368629 -0.0263240 -0.0513811 -0.0950757
text1422 -0.0259037 -0.0219388 -0.0167730 -0.0547199
text1423 -0.0368457 -0.0280660 -0.0328888 -0.0797909
text1424 -0.0361311 -0.0085427 -0.0430633 -0.0817451
text1425 -0.0071000 -0.0051373 -0.0085707 -0.0121793
text1426 -0.0136770 -0.0056069 0.0091418 0.0027283
text1427 -0.0234082 -0.0096756 0.0297016 0.0122550
text1428 -0.0201513 -0.0022685 0.0193452 0.0033832
text1429 -0.0109535 -0.0037946 0.0085212 -0.0008822
text1430 -0.0246360 -0.0213953 0.0665742 0.0141814
text1431 -0.0216925 -0.0162035 0.0521358 0.0117723
text1432 -0.0221570 -0.0197074 0.0270826 0.0183850
text1433 -0.0085360 -0.0061643 0.0138918 0.0040306
text1434 -0.0290395 -0.0183560 0.0358862 0.0094803
text1435 -0.0128230 -0.0091225 0.0188358 0.0049251
text1436 -0.0226302 -0.0146832 0.0381589 0.0072173
text1437 -0.0226699 -0.0110577 0.0294877 0.0063232
text1438 -0.0343363 -0.0083079 0.0286105 0.0070472
text1439 -0.0257373 -0.0141723 0.0234504 0.0007082
text1440 -0.0392825 -0.0226554 0.0629511 0.0055264
text1441 -0.0172997 -0.0113387 0.0245082 0.0021510
text1442 -0.0250266 -0.0163937 0.0349318 0.0082551
text1443 -0.0233034 -0.0117810 0.0307270 0.0028201
text1444 -0.0262489 -0.0152228 0.0354920 0.0063862
text1445 -0.0324450 -0.0175934 0.0350503 0.0038955
text1446 -0.0303233 -0.0159270 0.0375620 0.0043827
text1447 -0.0141617 -0.0080686 0.0181165 0.0013674
text1448 -0.0155422 -0.0057841 0.0114764 0.0041780
text1449 -0.0252204 -0.0154459 0.0254361 0.0061764
text1450 -0.0177621 -0.0080365 0.0234353 0.0076322
text1451 -0.0179631 -0.0098150 0.0208695 0.0018190
text1452 -0.0081377 -0.0039232 0.0077366 0.0017474
text1453 -0.0127695 -0.0093685 0.0206262 0.0038767
text1454 -0.0210029 -0.0148554 0.0205513 0.0027548
text1455 -0.0111173 -0.0100176 0.0172857 0.0038602
text1456 -0.0176743 -0.0147119 0.0336330 0.0073582
text1457 -0.0038202 -0.0033638 0.0078626 0.0019853
text1458 -0.0214738 -0.0107076 0.0219580 0.0078269
text1459 -0.0238631 -0.0169482 0.0258979 0.0073586
text1460 -0.0157018 -0.0087490 0.0120742 -0.0013495
text1461 -0.0316158 -0.0141059 0.0248640 0.0030479
text1462 -0.0225950 -0.0069671 0.0189227 -0.0013091
text1463 -0.0067600 -0.0036397 0.0049023 0.0002011
text1464 -0.0127280 -0.0025830 0.0106006 0.0011827
text1465 -0.0290737 -0.0100324 0.0204337 0.0101260
text1466 -0.0162131 -0.0089068 0.0189332 0.0077427
text1467 -0.0222891 -0.0097112 0.0177865 0.0055085
text1468 -0.0480732 -0.0210247 0.0152098 -0.0214661
text1469 -0.0029752 -0.0017168 0.0042393 0.0007534
text1470 -0.0296134 -0.0113073 0.0301954 -0.0039040
text1471 -0.0101514 -0.0033470 0.0048832 -0.0014561


This Topic strength table reports the strength (singular value) of each LSA dimension. The singular values are non-increasing by construction, so dimension 1 has the largest strength and dimension 4 the smallest.

Topic_Strength2 <- data.frame(CN,TED.lsa2$sk)
colnames(Topic_Strength2) <- c("Dimension", "Topic strength")

head(Topic_Strength2) %>% flextable() %>% add_header_lines("Topic strength by dimension") %>% autofit()


This Terms-topic sim. table shows the association between each term and each dimension. For example, the terms “artificial” and “intelligence” both load most heavily (in absolute value) on dimension 3.

kable(head(TED.lsa2$features,10), 
      col.names = c("dimension1","dimension2","dimension3","dimension4"),
      caption = "Terms-topic sim.(LSA on TF-IDF)") %>%
   kable_paper() %>%
   kableExtra::scroll_box(width = "100%", height = "200px")
Terms-topic sim.(LSA on TF-IDF)
dimension1 dimension2 dimension3 dimension4
today -0.0536687 0.0095691 -0.0068271 -0.0241286
artificial -0.0415968 -0.0378650 -0.0482462 -0.0482060
intelligence -0.0645813 -0.0679693 -0.0779669 -0.0735150
help -0.0327216 -0.0080277 0.0110845 -0.0074941
doctor -0.0239033 -0.0187716 0.0082665 -0.0113366
diagnose -0.0133458 -0.0141806 -0.0075820 -0.0165815
patient -0.0277309 -0.0236963 -0.0039323 -0.0299937
pilot -0.0067898 0.0018689 -0.0036318 0.0001526
fly -0.0204832 -0.0025275 -0.0187381 0.0240834
commercial -0.0072157 0.0025301 -0.0057654 -0.0052020


We also examine the top words for dimensions 2, 3, and 4 of the LSA on TF-IDF.

## For Dimension 2
w2.order <- sort(TED.lsa2$features[, 2],decreasing = TRUE)
w2.top.d2 <- c(w2.order[1:n.terms],rev(rev(w2.order)[1:n.terms]))
## For Dimension 3
w2.order <- sort(TED.lsa2$features[, 3], decreasing = TRUE)
w2.top.d3 <- c(w2.order[1:n.terms], rev(rev(w2.order)[1:n.terms]))
## For Dimension 4
w2.order <- sort(TED.lsa2$features[, 4], decreasing = TRUE)
w2.top.d4 <- c(w2.order[1:n.terms], rev(rev(w2.order)[1:n.terms]))

w2.top.d2 <- t(w2.top.d2)
kable(w2.top.d2, 
      caption = "Top 5 and bottom 5 of Dimension2 (LSA on TF-IDF)") %>%
   kable_paper()
Top 5 and bottom 5 of Dimension2 (LSA on TF-IDF)
forest carbon climate emission energy human computer machine ai robot
0.2176291 0.2084673 0.1770268 0.1768026 0.1434831 -0.0782871 -0.0820833 -0.0828424 -0.1643167 -0.2829436


For this LSA, dimension 2 is positively associated with “forest”, “carbon”, “climate”, “emission”, and “energy”, and negatively associated with “human”, “computer”, “machine”, “ai”, and “robot”.

w2.top.d3 <- t(w2.top.d3)
kable(w2.top.d3, 
      caption = "Top 5 and bottom 5 of Dimension3 (LSA on TF-IDF)") %>%
   kable_paper()
Top 5 and bottom 5 of Dimension3 (LSA on TF-IDF)
regret sex woman love man datum machine rule ai robot
0.2568698 0.2312091 0.1600525 0.1539985 0.1155972 -0.0791178 -0.0826875 -0.0872609 -0.254843 -0.4413087


Dimension 3 is positively associated with “regret”, “sex”, “woman”, “love”, and “man”, and negatively associated with “datum”, “machine”, “rule”, “ai”, and “robot”.

w2.top.d4 <- t(w2.top.d4)
kable(w2.top.d4, 
      caption = "Top 5 and bottom 5 of Dimension4 (LSA on TF-IDF)") %>%
   kable_paper()
Top 5 and bottom 5 of Dimension4 (LSA on TF-IDF)
robot rule bee seaweed coral machine human company datum ai
0.6214054 0.1322063 0.1247338 0.1094784 0.1068444 -0.0782485 -0.0878439 -0.1099502 -0.146753 -0.4063409


Dimension 4 is positively associated with “robot”, “rule”, “bee”, “seaweed”, and “coral”, and negatively associated with “machine”, “human”, “company”, “datum”, and “ai”.

Next, we examine how the LSA result relates to the category of each text. We therefore combine the LSA document scores with the category labels and plot every text in the two figures below.

TED.lsa2.source <- TED_full %>% 
  select(2) %>% cbind(as.data.frame(TED.lsa2$docs))

LSA_p3 <- ggplot(data=TED.lsa2.source,mapping = aes(
  x=V2,
  y=V3,
  color=cate))+
  geom_point()+
  labs(x = "dimension2",
       y = "dimension3",
       title = "Distribution of texts in different categories",
       subtitle = "LSA(TF-IDF) dimension 2 and 3")+
  scale_colour_discrete(
    name="Category",
    breaks=c("1","2","3"),
    labels=c("AI","Climate change","Relationships"))+
  theme(plot.title = element_text(size = 12))

LSA_p4 <- ggplot(data=TED.lsa2.source,mapping = aes(
  x=V3,
  y=V4,
  color=cate))+
  geom_point()+
  labs(x = "dimension3",
       y = "dimension4",
       title = "Distribution of texts in different categories",
       subtitle = "LSA(TF-IDF) dimension 3 and 4")+
  scale_colour_discrete(
    name="Category",
    breaks=c("1","2","3"),
    labels=c("AI","Climate change","Relationships"))+
  theme(plot.title = element_text(size = 12))

(LSA_p3+LSA_p4)+
  plot_layout(guides = "collect") & theme(legend.position = 'bottom')

In the left-hand plot, the x-axis represents dimension 2 and the y-axis dimension 3. The “Climate change” category is positively associated with dimension 2, while the “Relationships” category is positively associated with dimension 3; the “AI” category is negatively associated with both dimensions.

In the right-hand plot, the x-axis represents dimension 3 and the y-axis dimension 4. A sizeable portion of the “AI” texts load strongly on dimension 4, although the sign of the association varies, and this plot alone does not reveal what drives the pattern.
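
To back this visual reading with numbers, we can average the document scores per category. A minimal sketch using the TED.lsa2.source object built above (the aggregate call is illustrative, not part of the original pipeline):

## Sketch: mean LSA score per category on dimensions 2 to 4.
## The signs of the means should mirror the patterns seen in the plots.
aggregate(cbind(V2, V3, V4) ~ cate, data = TED.lsa2.source, FUN = mean)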

6.2 LDA

We now turn to Latent Dirichlet Allocation (LDA). LDA is a generative Bayesian model for topic modeling that discovers latent topics in a collection of documents. For comparability with the LSA results, we again fit 4 topics.
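
As a brief reminder of LDA's generative story (standard notation, not tied to our corpus): each document d draws topic proportions, each topic k draws a word distribution, and every token is generated by first sampling a topic and then a word:

$$\theta_d \sim \mathrm{Dirichlet}(\alpha), \qquad \beta_k \sim \mathrm{Dirichlet}(\eta), \qquad z_{d,n} \mid \theta_d \sim \mathrm{Cat}(\theta_d), \qquad w_{d,n} \mid z_{d,n} \sim \mathrm{Cat}(\beta_{z_{d,n}})$$

Fitting the model amounts to inferring these latent quantities from the observed tokens.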

TED.LDA <- LDA(
  convert(TED.dfm, to = "topicmodels"),
  k = 4,
  control = list(seed = 123))

First, we examine the top 5 terms in each topic. For example, the top 5 terms for topic 1 are “climate”, “year”, “make”, “change”, and “energy”.

top5lda <- as.data.frame(topicmodels::terms(TED.LDA, 5))

head(top5lda) %>% flextable() %>% add_header_lines("Top5 terms for each topic") %>% autofit()




Then, we create a table showing the number of documents assigned to each topic. For example, topic 3 has the most documents (439).

ldatable <- as.data.frame(topicmodels::topics(TED.LDA) %>% table())
colnames(ldatable) <- c("Topic", "Number of documents")

head(ldatable) %>% flextable() %>% add_header_lines("Number of documents by topic") %>% autofit()


Subsequently, we apply the topic_diagnostics function (from the topicdoc package) to assess the prominence, coherence, and exclusivity of each topic.

td <- topic_diagnostics(
  topic_model = TED.LDA, 
  dtm_data = convert(TED.dfm, to = "topicmodels"))

kable(td,
      caption = "Topic diagnostics")%>%
   kable_paper() %>%
   kableExtra::scroll_box(width = "100%", height = "200px")
Topic diagnostics
topic_num topic_size mean_token_length dist_from_corpus tf_df_dist doc_prominence topic_coherence topic_exclusivity
1 3383.187 5.6 0.4216740 19.70057 418 -86.57526 8.904578
2 3644.634 5.1 0.3793068 22.85747 541 -73.97054 8.708857
3 4124.771 5.2 0.3999282 21.41908 574 -77.63556 8.381961
4 3892.408 4.4 0.4113173 21.01260 435 -67.67438 8.289089


Based on these diagnostics, Topic 3 has the highest document prominence (574 documents). Topic 4 has the highest coherence (-67.7) and Topic 1 the lowest (-86.6), while exclusivity is highest for Topic 1 (8.90) and lowest for Topic 4 (8.29). The identified topics therefore differ noticeably in prominence, coherence, and exclusivity.

beta.long <- tidy(
  TED.LDA,
  matrix = "beta") # equivalent to melt (with this package)

beta.long %>% 
  group_by(topic) %>% 
  top_n(15, beta) %>% 
  ggplot(aes(reorder_within(term, beta, topic), beta)) + 
  geom_col(show.legend = FALSE) +
  ggtitle("Topic-term probabilities (Phi's)") +
  coord_flip()+
  facet_wrap(~ topic, scales = "free_y") +
  scale_x_reordered() + 
  xlab("Term") +
  theme(
    axis.text.y = element_text(size = 8),
    axis.text.x = element_text(size = 8),
    strip.text = element_text(size = 8))

Topic 1 focuses on “climate”, “change”, “energy”, and “water”; Topic 2 on “people”, “ai”, “work”, and “technology”; Topic 3 on “love”, “life”, “woman”, and “relationship”; Topic 4 on “robot”, “thing”, “brain”, and “human”.

document <- rownames(TED.lsa.source)
TED.lsa.source <- cbind(document,TED.lsa.source)

gamma.long <- tidy(TED.LDA,matrix = "gamma") %>% 
  right_join(TED.lsa.source[1:2],by = "document")

gamma.long$cate<-factor(gamma.long$cate,
                       levels = c('1','2','3'),
                       labels = c("AI","Climate change","Relationships"))

gamma.long %>% ggplot(mapping = aes(x=document,y=gamma,fill=cate))+
  ggtitle("Topic-Document probabilities (Theta’s)") +
  geom_col()+
  coord_flip() + 
  facet_wrap(~topic,ncol = 4)

The charts above show that the Climate change documents mainly load on Topic 1 and the Relationships documents on Topic 3, while the AI documents are split between Topic 2 and Topic 4.
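
To quantify this alignment, we can cross-tabulate each document's most likely topic against its category label. A minimal sketch (doc_topic and doc_cate are illustrative names; TED.lsa.source is the object built above):

## Sketch: modal topic per document versus category label.
doc_topic <- topicmodels::topics(TED.LDA)  # most likely topic for each document
doc_cate  <- TED.lsa.source$cate[match(names(doc_topic), TED.lsa.source$document)]
table(Category = doc_cate, Topic = doc_topic)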

7. Embedding

In addition to Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA), we also use word and document embeddings to analyze the transcripts of TED videos. Embedding refers to representing elements (e.g., tokens or documents) in a vector space: words are embedded first, and document embeddings are then constructed from the word vectors, capturing the co-occurrence patterns of the words within each document.

7.1 Word Embedding

The objective of word embedding is to learn a representation of words that reflects their co-occurrence patterns. To obtain these co-occurrence patterns, we use the fcm function from the quanteda package.

Below, we present a sample of the resulting co-occurrence data. The co-occurrence count between “artificial” and “intelligence” is relatively high (170), while that between “fly” and “intelligence” is zero. These counts reflect how words are used together in the corpus and are the input for constructing the word embeddings.

TED.coo <- fcm(TED.tk,
               context = "window",
               window = 5,
               tri = FALSE) 

TC <- head(TED.coo) %>% 
  convert(to="data.frame") %>% 
  select(1:6)

kable(TC,
      caption = "Co-occurrence matrix") %>%
   kable_paper() %>%
   kableExtra::scroll_box(width = "100%", height = "200px")
Co-occurrence matrix
doc_id today artificial intelligence diagnose fly
today 28 8 8 4 5
artificial 8 30 170 2 2
intelligence 8 170 78 1 0
diagnose 4 2 1 0 1
fly 5 2 0 1 8
commercial 0 0 0 1 2
# RcppParallel::setThreadOptions(1)
set.seed(123)
p <- 2 # word embedding dimension
TED.glove <- GlobalVectors$new(rank = p,
                               x_max = 10) # x_max is a required technical option
TED.we <- TED.glove$fit_transform(TED.coo, n_threads = 1) # main (central) vectors; TED.glove$components contains the context vectors
TED.we <- t(TED.glove$components) + TED.we # sum main and context vectors for the final representation

To visualize the learned word embeddings, we create two plots. The first depicts the vectors of the 100 most frequent words; the second shows all words but labels only a subset of them.

index <- textstat_frequency(dfm(TED.tk))[1:100, ]$feature
## words with the 100 largest frequencies

data.for.plot <- data.frame(TED.we[index, ])
data.for.plot$word <- row.names(data.for.plot)

Emb_p1 <- ggplot(data.for.plot, 
       aes(x = X1,
           y = X2,
           label = word)) +
  geom_text_repel(max.overlaps = 100)+
  theme_void() +
  labs(title="Map of top100 words")

TED.we.df <- as.data.frame(TED.we)
word <- rownames(TED.we.df)
TED.we.df <- cbind(word,TED.we.df)
row.names(TED.we.df) <- seq_len(nrow(TED.we.df))

Emb_p2 <- ggplot(TED.we.df,aes(x=V1,y=V2))+
  geom_text_repel(data = subset(TED.we.df, V1 <=-1.8|V2>3|V1>2),
            mapping = aes(label = word),
            hjust = "inward",
            max.overlaps = 100) +
  geom_point(color="grey")+
  labs(title="Map of all words(partially labeled)")

Emb_p1

Emb_p2

To avoid label overlap between data points in the plots, we use the geom_text_repel function. Some labels are also accompanied by a black line, indicating the location of the corresponding data point.

The first plot depicts the relationships between frequently used words. It can be seen that words that are close in the embedding space are often used together. For example, the words “robot” and “computer” are close, indicating that they are frequently used together. Similarly, the words “man” and “woman” are close, suggesting that they are also commonly used together.

The second plot presents the distribution of all used words, with a subset labeled for illustration. This plot shows that words such as “carbon” and “emission” are close in the embedding space, indicating that they are often used together.
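
Proximity in the embedding space can also be checked numerically with cosine similarity. A minimal sketch (cos_sim is our own helper, not a package function):

## Sketch: cosine similarity of one word vector against all others.
cos_sim <- function(v, M) {
  as.vector(M %*% v) / (sqrt(sum(v^2)) * sqrt(rowSums(M^2)))
}
sims <- cos_sim(TED.we["robot", ], TED.we)
head(sort(setNames(sims, rownames(TED.we)), decreasing = TRUE), 10) # nearest neighbours of "robot"

With only p = 2 dimensions this is a coarse measure, but the top neighbours should match the visual clusters.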

7.2 Document Embedding

We now build the document embeddings by computing, for each document, the centroid (average) of the word vectors of its tokens.
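
A minimal sketch of the centroid computation (doc_centroid is an illustrative helper, not necessarily the report's exact code):

## Sketch: a document embedding as the centroid of its tokens' word vectors.
doc_centroid <- function(toks, we) {
  toks <- toks[toks %in% rownames(we)] # keep tokens that have a learned vector
  colMeans(we[toks, , drop = FALSE])   # average the word vectors
}
TED.de <- do.call(rbind, lapply(as.list(TED.tk), doc_centroid, we = TED.we))
dim(TED.de) # one p-dimensional centroid per document

The table below lists exactly the word vectors that would be averaged for the first document.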

kable(TED.we[TED.tk[[1]], ],
      col.names = c("dimension1","dimension2"),
      caption = "Word vectors") %>%
   kable_paper() %>%
   kableExtra::scroll_box(width = "100%", height = "200px")
Word vectors

word            dimension1   dimension2
today           -0.7912934    2.2696258
artificial       1.0067433    1.3543759
intelligence     1.4609250    1.7702632
help             0.2675278    1.2395821
doctor          -0.1246391    0.6178340
diagnose        -0.6098596   -0.0223995
patient         -0.4179482    0.5169117
pilot            0.3673782   -0.3877739
fly              0.4031451    0.3916120
commercial       0.5305939   -0.2669108
… (one row per token of the first document; remaining rows omitted)
nd <- length(TED.tk) # number of documents
TED.de <- matrix(nrow = nd, ncol = p) # document embedding matrix (1 document per row)
for (i in 1:nd) {
  words_in_i <- TED.we[TED.tk[[i]], , drop = FALSE]
  # drop = FALSE is needed in case there is only one token
  TED.de[i, ] <- apply(words_in_i, 2, mean)
}
row.names(TED.de) <- names(TED.tk)
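
The loop makes the centroid computation explicit; an equivalent, more compact formulation (a sketch that should produce the same matrix) takes the column means of each document's word vectors:

# Equivalent sketch: column means of the word vectors of each document.
TED.de.alt <- t(sapply(as.list(TED.tk),
                       function(toks) colMeans(TED.we[toks, , drop = FALSE])))
all.equal(TED.de, TED.de.alt) # should be TRUE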

kable(TED.de,
      col.names = c("dimension1","dimension2"),
      caption = "Document vectors") %>%
   kable_paper() %>%
   kableExtra::scroll_box(width = "100%", height = "200px")
Document vectors

document    dimension1   dimension2
text1        0.2709277    0.7512324
text2        0.4151799    0.9684588
text3        0.2466364    1.1242400
text4        0.2350248    1.2377504
text5        0.3214014    1.0811997
text6        0.4063020    1.3325672
text7        0.3395273    1.1668672
text8        0.3397086    1.0941657
text9        0.4061817    1.3347125
text10       0.5976759    1.2912744
… (1,471 documents in total; remaining rows omitted)


We now plot the document representations, using a different color for each category.

TED.de <- as.data.frame(TED.de)
TED.de.source <- TED_full %>% 
  select(2,3,4) %>% cbind(TED.de) 

ggplot(data=TED.de.source,mapping = aes(
  x=V1,
  y=V2,
  color=cate))+
  geom_point()+
  labs(x = "dimension1",
    y = "dimension2")+
  scale_colour_discrete(
    name="Category",
    breaks=c("1","2","3"),
    labels=c("AI","Climate change","Relationships")
  )

According to this plot, the documents in the “AI” and “Relationships” categories overlap the most. This suggests that documents in these two categories are more similar to each other than either is to documents in the “Climate change” category.
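
To complement the visual impression, the sketch below (assuming TED.de.source keeps the embedding in columns V1 and V2 and the category in cate) computes the average document vector per category; close centroids for “AI” and “Relationships” would confirm the overlap.

# Sketch: category centroids in the document-embedding space.
TED.de.source %>%
  group_by(cate) %>%
  summarise(mean_dim1 = mean(V1),
            mean_dim2 = mean(V2))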

8. Supervised learning

8.1 Supervised learning: LSA on TF

For supervised learning, we use the results of the LSA analysis from the earlier section and build a random forest model to predict the category. Reducing the dimensionality of the data before running supervised learning algorithms can be beneficial: it improves performance by reducing the complexity of the data, and it makes the results easier to interpret.

In this section, we apply the data from LSA on TF in supervised learning. We use the data frame consisting of the category and the “doc” matrix from the LSA. Then, we build the training set index based on an 80/20 split.

a <- c(1:nrow(TED.lsa.source))
row.names(TED.lsa.source) <- a
TED.lsa.source$cate <- as.factor(TED.lsa.source$cate) 

set.seed(123)
index.tr <- createDataPartition(y = TED.lsa.source$cate, p= 0.8, list = FALSE)
TED.tr <- TED.lsa.source[index.tr,]
TED.te <- TED.lsa.source[-index.tr,]

The data are unbalanced, so we use sub-sampling to balance them. The table below shows that there are 450, 327, and 401 training observations for Topic 1 (AI), Topic 2 (Climate change), and Topic 3 (Relationships), respectively.

numcate <- as.data.frame(table(TED.tr$cate))
colnames(numcate) <- c("Topic", "Count")

head(numcate) %>% flextable() %>% add_header_lines("The number of observations per Topic") %>% autofit()


After sub-sampling, the number of observations for each category is 327.

set.seed(123)
n2 <- min(table(TED.tr$cate)) ## 327

TED.tr.1 <- filter(TED.tr, cate=="1") ## the category 1
TED.tr.2 <- filter(TED.tr, cate=="2") ## the category 2
TED.tr.3 <- filter(TED.tr, cate=="3") ## the category 3
index.1 <- sample(size=n2, x=1:nrow(TED.tr.1), replace=FALSE)
index.3 <- sample(size=n2, x=1:nrow(TED.tr.3), replace=FALSE)
TED.tr.subs <- data.frame(rbind(TED.tr.1[index.1,], 
                                TED.tr.2,
                                TED.tr.3[index.3,]))

numcatesub <- as.data.frame(table(TED.tr.subs$cate))
colnames(numcatesub) <- c("Topic", "Count")

head(numcatesub) %>% flextable() %>% add_header_lines("The number of observations per Topic, after sub-sampling") %>% autofit()
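
The manual filtering above can also be written with caret's downSample(), which randomly drops observations from the majority classes until every class matches the smallest one. A sketch, assuming cate is the first column of TED.tr (the sampled rows will differ from ours because the random draws differ):

# Sketch: caret's built-in majority-class down-sampling.
set.seed(123)
TED.tr.subs.alt <- downSample(x = TED.tr[, -1],
                              y = TED.tr$cate,
                              yname = "cate")
table(TED.tr.subs.alt$cate) # 327 observations per class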


We now use a random forest to predict the category from the LSA on TF. Then, the model accuracy is inspected on the test set. The results below show a confusion matrix and the associated statistics.

TED.fit <- ranger(TED.tr.subs$cate ~ ., 
                     data = TED.tr.subs[2:6])
pred.te <- predict(TED.fit, TED.te)
confusion_matrix_output <- confusionMatrix(data=pred.te$predictions, reference = TED.te$cate)

# Extract the confusion matrix, accuracy, and balanced accuracies from the output
confusion_matrix <- confusion_matrix_output$table
accuracy <- confusion_matrix_output$overall[1]
balanced_accuracies <- confusion_matrix_output$byClass[, "Balanced Accuracy"]

# Convert the confusion matrix and balanced accuracies to data frames
confusion_matrix_df <- as.data.frame.matrix(confusion_matrix)
rownames(confusion_matrix_df) <- c("Topic 1", "Topic 2", "Topic 3")
colnames(confusion_matrix_df) <- c("Topic 1", "Topic 2", "Topic 3")

confusion_matrix_df$Balanced_Accuracy <- balanced_accuracies

# Use kable to display the confusion matrix and balanced accuracies
kable(confusion_matrix_df, format = "html", align = "c", caption = "Confusion Matrix, Accuracy and Balanced Accuracy", caption.above = TRUE)  %>%
kable_styling(full_width = FALSE) %>%
  add_footnote(sprintf("Accuracy: %.5f",accuracy))
Confusion Matrix, Accuracy and Balanced Accuracy
(rows: prediction, columns: reference)

          Topic 1   Topic 2   Topic 3   Balanced_Accuracy
Topic 1        81         4         6           0.8339828
Topic 2        11        71         9           0.8911018
Topic 3        20         6        85           0.8576425

Accuracy: 0.80887

According to the confusion matrix, the accuracy is 0.8089 and the balanced accuracy for class 1 is 0.8340, for class 2 is 0.8911, and for class 3 is 0.8576.
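
To see which LSA dimensions drive these predictions, the forest can be refit with impurity-based variable importance, which ranger exposes through its importance argument. A sketch, not part of the reported results:

# Sketch: refit with impurity importance and rank the features.
set.seed(123)
TED.fit.imp <- ranger(TED.tr.subs$cate ~ .,
                      data = TED.tr.subs[2:6],
                      importance = "impurity")
sort(ranger::importance(TED.fit.imp), decreasing = TRUE)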

8.2 Supervised learning: LSA on TF-IDF

Now we use LSA on TF-IDF instead and repeat the same steps with the sub-sampling technique and the random forest model. The results below show a confusion matrix and the associated statistics.

TED.lsa2.source <- cbind(document,TED.lsa2.source)
row.names(TED.lsa2.source) <- a
TED.lsa2.source$cate <- as.factor(TED.lsa2.source$cate)

set.seed(123)
index.tr <- createDataPartition(y = TED.lsa2.source$cate, p= 0.8, list = FALSE)
TED.tr <- TED.lsa2.source[index.tr,]
TED.te <- TED.lsa2.source[-index.tr,]

n2 <- min(table(TED.tr$cate)) ## 327

TED.tr.1 <- filter(TED.tr, cate=="1") ## the category 1
TED.tr.2 <- filter(TED.tr, cate=="2") ## the category 2
TED.tr.3 <- filter(TED.tr, cate=="3") ## the category 3
index.1 <- sample(size=n2, x=1:nrow(TED.tr.1), replace=FALSE)
index.3 <- sample(size=n2, x=1:nrow(TED.tr.3), replace=FALSE)
TED.tr.subs <- data.frame(rbind(TED.tr.1[index.1,], 
                                TED.tr.2,
                                TED.tr.3[index.3,]))

TED.fit2 <- ranger(TED.tr.subs$cate ~ ., 
                     data = TED.tr.subs[2:6])
pred.te <- predict(TED.fit2, TED.te)
confusion_matrix_output <- confusionMatrix(data=pred.te$predictions, reference = TED.te$cate)

# Extract the confusion matrix, accuracy, and balanced accuracies from the output
confusion_matrix <- confusion_matrix_output$table
accuracy <- confusion_matrix_output$overall[1]
balanced_accuracies <- confusion_matrix_output$byClass[, "Balanced Accuracy"]

# Convert the confusion matrix and balanced accuracies to data frames
confusion_matrix_df <- as.data.frame.matrix(confusion_matrix)
rownames(confusion_matrix_df) <- c("Topic 1", "Topic 2", "Topic 3")
colnames(confusion_matrix_df) <- c("Topic 1", "Topic 2", "Topic 3")

confusion_matrix_df$Balanced_Accuracy <- balanced_accuracies

# Use kable to display the confusion matrix and balanced accuracies
kable(confusion_matrix_df, format = "html", align = "c", caption = "Confusion Matrix, Accuracy and Balanced Accuracy", caption.above = TRUE)  %>%
kable_styling(full_width = FALSE) %>%
  add_footnote(sprintf("Accuracy: %.5f",accuracy))
Confusion Matrix, Accuracy and Balanced Accuracy
(rows: prediction, columns: reference)

          Topic 1   Topic 2   Topic 3   Balanced_Accuracy
Topic 1        91         3         3           0.8896754
Topic 2         5        73         3           0.9317494
Topic 3        16         5        94           0.9155959

Accuracy: 0.88055

According to the confusion matrix, the accuracy is 0.8805 and the balanced accuracies for classes 1, 2, and 3 are 0.8897, 0.9317, and 0.9156, respectively. Thus, we can conclude that the model built on features from LSA on TF-IDF has a higher accuracy than the model built on features from LSA on TF.
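
Beyond balanced accuracy, the caret output also carries per-class sensitivity and specificity, which can be extracted in the same way as above. A sketch:

# Sketch: per-class sensitivity and specificity from the same caret output.
confusion_matrix_output$byClass[, c("Sensitivity", "Specificity")]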

8.3 Supervised learning: Embedding

For the next step, we use the document embedding from the Embedding section as features in the random forest model. Before running the model, we also use the sub-sampling technique to balance the data.

The results below show a confusion matrix and associated statistics.

row.names(TED.de.source) <- a
TED.de.source$cate <- as.factor(TED.de.source$cate) 

set.seed(123)

TED.Emb.tr <- TED.de.source[index.tr,c('cate','V1','V2')]
TED.Emb.te <- TED.de.source[-index.tr,c('cate','V1','V2')]

set.seed(123)
n2 <- min(table(TED.Emb.tr$cate)) ## 327, the same class minimum as before

TED.Emb.tr.1 <- filter(TED.Emb.tr, cate=="1") ## the category 1
TED.Emb.tr.2 <- filter(TED.Emb.tr, cate=="2") ## the category 2
TED.Emb.tr.3 <- filter(TED.Emb.tr, cate=="3") ## the category 3
index.1 <- sample(size=n2, x=1:nrow(TED.Emb.tr.1), replace=FALSE)
index.3 <- sample(size=n2, x=1:nrow(TED.Emb.tr.3), replace=FALSE)
TED.Emb.tr.subs <- data.frame(rbind(TED.Emb.tr.1[index.1,], 
                                TED.Emb.tr.2,
                                TED.Emb.tr.3[index.3,]))

set.seed(123)
TED.Emb.fit <- ranger(TED.Emb.tr.subs$cate~., 
                      data = TED.Emb.tr.subs)
pred.Emb.te <- predict(TED.Emb.fit, TED.Emb.te)
confusion_matrix_output <- confusionMatrix(data=pred.Emb.te$predictions, reference = TED.Emb.te$cate)


# Extract the confusion matrix, accuracy, and balanced accuracies from the output
confusion_matrix <- confusion_matrix_output$table
accuracy <- confusion_matrix_output$overall[1]
balanced_accuracies <- confusion_matrix_output$byClass[, "Balanced Accuracy"]

# Convert the confusion matrix and balanced accuracies to data frames
confusion_matrix_df <- as.data.frame.matrix(confusion_matrix)
rownames(confusion_matrix_df) <- c("Topic 1", "Topic 2", "Topic 3")
colnames(confusion_matrix_df) <- c("Topic 1", "Topic 2", "Topic 3")

confusion_matrix_df$Balanced_Accuracy <- balanced_accuracies

# Use kable to display the confusion matrix and balanced accuracies
kable(confusion_matrix_df, format = "html", align = "c", caption = "Confusion Matrix, Accuracy and Balanced Accuracy", caption.above = TRUE) %>%
  kable_styling(full_width = FALSE) %>%
  add_footnote(sprintf("Accuracy: %.5f", accuracy))
Confusion Matrix, Accuracy and Balanced Accuracy
(rows = predicted topic, columns = actual topic)

            Topic 1   Topic 2   Topic 3   Balanced Accuracy
Topic 1        58         3        35        0.6539562
Topic 2         8        69        12        0.8787561
Topic 3        46         9        53        0.6225130

Accuracy: 0.61433

We find that the accuracy of this model is only 0.6143, with balanced accuracies of 0.6540, 0.8788, and 0.6225 for classes 1, 2, and 3, which is well below the previous models. A likely reason is that only two embedding dimensions (V1 and V2) are used as features, which may retain too little information compared with the LSA representations.

8.4 Supervised learning: LSA (TF-IDF), likes, and views

From the first three models, we find that the model using LSA (TF-IDF) features has the highest accuracy. In this section, we add further information, namely the number of likes and the number of views, as extra features in the supervised learning model.

After balancing the data, we train a random forest model and obtain the results shown below.

# rename the embedding dimensions so they do not clash with the LSA feature names
TED.de.source <- TED.de.source %>% 
  rename(V5=V1, V6=V2)
# combine the LSA (TF-IDF) features with the additional information,
# dropping a duplicated column from the cbind and the embedding dimensions
TED.LSA.Emb <- TED.lsa2.source %>% 
  cbind(TED.de.source) %>% 
  select(-7) %>%
  select(-c('V5','V6'))

# same train/test split as before
TED.Com.tr <- TED.LSA.Emb[index.tr,]
TED.Com.te <- TED.LSA.Emb[-index.tr,]

# balance the training classes by sub-sampling, as before
set.seed(123)
TED.Com.tr.1 <- filter(TED.Com.tr, cate=="1") ## category 1
TED.Com.tr.2 <- filter(TED.Com.tr, cate=="2") ## category 2
TED.Com.tr.3 <- filter(TED.Com.tr, cate=="3") ## category 3
index.1 <- sample(size=n2, x=1:nrow(TED.Com.tr.1), replace=FALSE)
index.3 <- sample(size=n2, x=1:nrow(TED.Com.tr.3), replace=FALSE)
TED.Com.tr.subs <- data.frame(rbind(TED.Com.tr.1[index.1,],
                                    TED.Com.tr.2,
                                    TED.Com.tr.3[index.3,]))

# fit the random forest on the combined feature set and predict the test set
set.seed(123)
TED.Com.fit <- ranger(TED.Com.tr.subs$cate ~ .,
                      data = TED.Com.tr.subs[2:8])
pred.Com.te <- predict(TED.Com.fit, TED.Com.te)
confusion_matrix_output <- confusionMatrix(data = pred.Com.te$predictions, reference = TED.Com.te$cate)

# Extract the confusion matrix, accuracy, and balanced accuracies from the output
confusion_matrix <- confusion_matrix_output$table
accuracy <- confusion_matrix_output$overall[1]
balanced_accuracies <- confusion_matrix_output$byClass[, "Balanced Accuracy"]

# Convert the confusion matrix and balanced accuracies to data frames
confusion_matrix_df <- as.data.frame.matrix(confusion_matrix)
rownames(confusion_matrix_df) <- c("Topic 1", "Topic 2", "Topic 3")
colnames(confusion_matrix_df) <- c("Topic 1", "Topic 2", "Topic 3")

confusion_matrix_df$Balanced_Accuracy <- balanced_accuracies

# Use kable to display the confusion matrix and balanced accuracies
kable(confusion_matrix_df, format = "html", align = "c", caption = "Confusion Matrix, Accuracy and Balanced Accuracy", caption.above = TRUE) %>%
  kable_styling(full_width = FALSE) %>%
  add_footnote(sprintf("Accuracy: %.5f", accuracy))
Confusion Matrix, Accuracy and Balanced Accuracy
(rows = predicted topic, columns = actual topic)

            Topic 1   Topic 2   Topic 3   Balanced Accuracy
Topic 1        95         1         3        0.9130574
Topic 2         5        75         2        0.9464535
Topic 3        12         5        95        0.9309585

Accuracy: 0.90444

We find that the performance of the model improves slightly: the accuracy increases from 0.8805 to 0.9044, and the balanced accuracy increases from 0.8897 to 0.9131 for class 1, from 0.9317 to 0.9465 for class 2, and from 0.9156 to 0.9310 for class 3. Thus, we conclude that this is the best-performing model.
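
To see how much the likes and views contribute relative to the LSA features, one could refit the forest with impurity-based variable importance; a minimal sketch (this step is not part of the analysis above):

# refit with variable importance to rank the features
set.seed(123)
TED.Com.imp <- ranger(TED.Com.tr.subs$cate ~ .,
                      data = TED.Com.tr.subs[2:8],
                      importance = "impurity")
sort(TED.Com.imp$variable.importance, decreasing = TRUE)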

9. Conclusion

The results of the sentiment analysis suggest that TED talks tend to present a positive sentiment, which may be intended to inspire and motivate audiences. On the other hand, the sentiment results do not clearly differentiate between categories of talks, so to some extent there is a lack of diversity in the videos' sentiment.

Topic analysis was then used to cluster the TED videos and compare the results to the known categories provided by the TED website. The data visualization from the topic modeling study showed that the clusters generated by both Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were closely aligned with the categories labeled by the TED website. Additionally, the performance of a supervised learning model that used the results of LSA as features was assessed, providing a more quantitative measure of the effectiveness of LSA in categorizing the texts.

Supervised learning techniques were applied to predict the category of TED videos using a random forest model. Four different sets of features were used for the prediction model: Latent Semantic Analysis (LSA) on Term Frequency (TF), LSA on Term Frequency-Inverse Document Frequency (TF-IDF), document embeddings, and a combination of LSA on TF-IDF and additional information, namely the number of likes and views. The results showed that the last model performed best, with an overall accuracy of 0.90 and balanced accuracies above 0.91 for all three categories.
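
For reference, the test-set accuracies of the four feature sets are summarized below:

Feature set                        Accuracy
LSA on TF                          0.8089
LSA on TF-IDF                      0.8805
Document embeddings                0.6143
LSA on TF-IDF + likes and views    0.9044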

10. Further Study and Limitations

One limitation of our analysis is the difficulty of obtaining a large amount of transcript data, given constraints on time and computing capacity. Additionally, the structure of the TED website is not conducive to scraping text, which limits the richness and diversity of the data. Furthermore, additional features on the TED website, such as information about the speakers and viewer comments, are scattered across pages, which limits their value in our analysis.

In addition, we can expand our research based on the available information. For example, we can analyze trends in TED talk releases and predict their release schedule, which can help investors understand TED’s plans and the messages it intends to deliver to its audience. We can also analyze the variations in the wording of speeches given by the same speakers or on the same topics. Additionally, TED currently has nearly 400 options for categorizing videos, which may not be ideal for users trying to find a specific topic. Therefore, TED can improve the classification of videos by analyzing which categories viewers are more inclined to comment on in their feedback.

References

TED. https://www.ted.com/