Friday, February 23, 2007
Lab 6 Post
I found that the binary number 110010101 is equal to 405 in decimal. I did this just by following the lecture slides and using the binary arithmetic rules: each digit is multiplied by its positional weight (a power of two) and the results are added together. For the second question, I found that the decimal number 529 is equal to 1000010001 in binary. I found this by repeatedly dividing by two and keeping track of the quotient and the remainder. This week, I also learned a lot about positional and non-positional number systems. In a positional system, the value of each digit is determined by its position. In a non-positional system, the value of a symbol is not affected by its position.
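To make the two conversions above concrete, here is a minimal Python sketch of the same steps; the function names and the example calls are my own, not from the lab handout.

```python
# Convert a binary string to decimal by summing each digit times its
# positional weight (a power of two).
def binary_to_decimal(bits):
    value = 0
    for digit in bits:
        value = value * 2 + int(digit)   # shift left and add the next digit
    return value

# Convert a decimal number to a binary string by repeated division by two,
# collecting the remainders (least significant digit first).
def decimal_to_binary(n):
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        n, remainder = divmod(n, 2)
        digits.append(str(remainder))
    return "".join(reversed(digits))

print(binary_to_decimal("110010101"))   # 405
print(decimal_to_binary(529))           # 1000010001
```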
Friday, February 16, 2007
Lab 5 Unix Commands
The first command I chose is "whoami". This command prints the name of the current user. It relates to Windows/DOS because you can also look up the current user's information and profile settings from the Start menu.
The second command I chose is "more". This command lets you scroll through a file one screen at a time. It relates to Windows/DOS in the same way that pressing the up and down arrow keys lets you look through a document.
The third command I selected is "rm". This removes a file. It relates to the "delete" key in Windows/DOS, which also gets rid of a file.
The final command I chose is "cd". This moves you to another directory. It relates to Windows/DOS when you move from folder to folder, or directory to directory.
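To tie the four commands together, here is a minimal Python sketch of rough equivalents using only the standard library; the comparisons are approximate (for example, the paging is much simpler than what "more" actually does), and the function names are my own.

```python
import getpass   # current user name
import os        # file and directory operations

# "whoami": print the current user's name.
print(getpass.getuser())

# "more": show a file one screenful (here, 24 lines) at a time.
def more(path, page_size=24):
    with open(path) as f:
        lines = f.readlines()
    for start in range(0, len(lines), page_size):
        print("".join(lines[start:start + page_size]))
        input("--More-- (press Enter)")

# "rm": delete a file, like pressing the Delete key in Windows.
def rm(path):
    os.remove(path)

# "cd": move to another directory, like opening a different folder.
def cd(path):
    os.chdir(path)
```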
Chapter 6
After reading chapter 6, I feel much more knowledgeable about “global swarming”. The first thing that I found important was the discussion of recommender systems. These are computer programs that attempt to predict items, such as movies, music, books, news, and web pages, that a user might be interested in, based on information in the user’s profile. This is how they are used on Amazon.com, which sells all of these types of items, so recommender systems are very important and useful in online retail. They are also used in social networking sites such as Facebook. This is done through collaborative filtering, which is the method of predicting the interests of one user by collecting taste information from many users. This is exactly what Facebook does: a user is able to list his or her interests and then search for others in the network with similar interests.

The chapter also got into how search engines work. They mine the knowledge implicit in the many trails that structure the web. The first search engines relied on very simple forms of first-order heuristic search. Web search engines work by storing information about a large number of web pages, which they retrieve from the World Wide Web itself. This is done using a web crawler, also known as a spider, which follows every link it sees. When a user comes to the search engine and makes a query using key words, the engine looks up its index and provides a listing of the best-matching web pages. Search engines are only useful if the results they give are relevant, which is why most search engines rank their results in order of relevance.

The reading also presents a metaphor connecting the digital world and the biological world. Ants and other insects are constantly forced to choose paths, which is very similar to search engines and the World Wide Web: search engines take you down a path to where you want to go. Overall, I learned a lot from this reading about the web. There are things I take for granted, and I never think about how they work. This reading helped me understand them.
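Since the chapter describes collaborative filtering only in words, here is a minimal Python sketch of the idea; the users, movies, and ratings are made up purely for illustration, and real systems use much larger data and more careful math.

```python
from math import sqrt

# Toy ratings: user -> {item: rating}. All names and numbers are invented.
ratings = {
    "alice": {"matrix": 5, "titanic": 1},
    "bob":   {"matrix": 4, "titanic": 2, "up": 5},
    "carol": {"matrix": 1, "titanic": 5, "up": 2},
}

# Cosine similarity between two users over the items they have both rated.
def similarity(a, b):
    common = set(ratings[a]) & set(ratings[b])
    if not common:
        return 0.0
    dot = sum(ratings[a][i] * ratings[b][i] for i in common)
    norm_a = sqrt(sum(ratings[a][i] ** 2 for i in common))
    norm_b = sqrt(sum(ratings[b][i] ** 2 for i in common))
    return dot / (norm_a * norm_b)

# Recommend items that similar users liked but the target user has not rated.
def recommend(user):
    scores = {}
    for other in ratings:
        if other == user:
            continue
        sim = similarity(user, other)
        for item, rating in ratings[other].items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("alice"))   # items alice has not seen, ranked by similar users' tastes
```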
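The crawl-index-query cycle can be sketched the same way. This is only a cartoon of what a real search engine does: the "web" here is three hard-coded pages instead of something a crawler fetched, and the ranking is just a count of matching key words rather than real relevance scoring.

```python
# Toy "web": page name -> text. In a real engine a crawler (spider) would
# fetch these pages by following links; here they are hard-coded.
pages = {
    "page_a": "ants follow pheromone trails to find food",
    "page_b": "search engines crawl the web by following links",
    "page_c": "recommender systems predict movies and music",
}

# Build an inverted index: word -> set of pages containing that word.
index = {}
for name, text in pages.items():
    for word in text.lower().split():
        index.setdefault(word, set()).add(name)

# Answer a query by looking up each key word in the index and ranking
# pages by how many of the query words they contain.
def search(query):
    hits = {}
    for word in query.lower().split():
        for page in index.get(word, set()):
            hits[page] = hits.get(page, 0) + 1
    return sorted(hits, key=hits.get, reverse=True)

print(search("following trails"))   # pages containing the query words, ranked by match count
```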
Thursday, February 8, 2007
Response to "Modeling the World"
I find it interesting how far back information goes. The article discusses how physics was the first set of precise rules, dating all the way back to Aristotle. He was the first to relate symbols to the external world, and he came up with mathematical rules to explain why certain things happen in the world. The notes then go into discussing what a model is. They define it as any complete and consistent set of verbal arguments, mathematical equations, or computational rules that is thought to correspond to some other entity. This pretty much means that a model describes how something works. The main purposes of modeling are data analysis, interpretation, control, prediction, and understanding; it is basically just to help with understanding. The next thing that is necessary to understand about models is how they are created. They cannot exist without experimentation, so it is always necessary to perform tests; this is how information is gained. Overall, I found this reading very interesting, and it helped me because I feel like I sometimes take simple aspects of life for granted. This helped me realize where it all comes from.