Friday, April 20, 2007

Lab 10

Hartley and Shannon have different ways of measuring uncertainty. Hartley's measure refers to the amount of uncertainty associated with a finite set of alternatives, measured by the amount of information needed to remove that uncertainty, i.e., the information revealed once we know the outcome. It is given by the Hartley function H(A) := log_b(|A|), where if the log base is 2, the uncertainty is measured in bits, and if it is the natural logarithm, the unit is the nat. Shannon entropy, on the other hand, is the average amount of uncertainty associated with a set of weighted alternatives, measured by the average amount of information needed to remove the uncertainty. The main difference is that Shannon entropy is an average over a random variable: it gives the average amount of information the recipient is missing when they do not know the value of the random variable, whereas the Hartley measure is simply how much information is needed to remove the uncertainty over a set of equally possible alternatives.
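Here is a small Python sketch (my own illustration, not part of the lab) that shows the difference; the function names hartley and shannon_entropy are just my own labels.

```python
import math

def hartley(alternatives, base=2):
    """Hartley measure: log_b of the number of alternatives."""
    return math.log(len(alternatives), base)

def shannon_entropy(probabilities, base=2):
    """Shannon entropy: average information over weighted alternatives."""
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

# Four equally likely alternatives: both measures agree (2 bits).
print(hartley(["a", "b", "c", "d"]))              # 2.0
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# Skewed weights: the average (Shannon) drops below the Hartley measure.
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))      # about 1.36 bits
```

When the alternatives are all equally likely, the two measures give the same number; the gap only appears once the weights are uneven, which is exactly the "average over a random variable" idea.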

Flowchart

Friday, April 6, 2007

Lab 9

I learned a lot in this lab about making calculations using Excel. The lab forced me to calculate things on my own before using the computer, and it was much easier using the computer's data analysis tool. This shows how advanced technology is becoming and how helpful it can be. Data analysis is very helpful in transforming data with the aim of extracting useful information; this includes statistical analysis and curve fitting, and it is very useful for finding patterns in data. This applies to many real-life examples, since there are relationships between many variables. Some real-life examples that we discussed in class are years of schooling vs. level of income, high-school vs. college GPA, and inflation rate vs. prime lending rate. There are also different types of relationships, such as direct and inverse linear relationships. Scatter diagrams plotted around a linear relationship help show whether the fitted line is a more or less accurate estimator of the relationship between x and y, and comparing the data you have collected to the linear model in the diagram accounts for the differences between the actual values and the estimated values. A small sketch of this kind of line fit is shown below. Overall, this lab taught me a lot about analyzing information, and I found it very useful.
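As a rough illustration of what the data analysis tool does behind the scenes, here is a short Python sketch of an ordinary least-squares line fit; the data points and variable names are invented for the example, not taken from the lab.

```python
def fit_line(xs, ys):
    """Return slope and intercept of the least-squares line y = m*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    m = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - m * mean_x
    return m, b

# Hypothetical "years of schooling" vs. "income" data (invented numbers).
years = [10, 12, 14, 16, 18]
income = [25, 31, 40, 47, 55]   # in thousands
m, b = fit_line(years, income)
print(f"income is roughly {m:.2f} * years + {b:.2f}")

# The residual (actual minus estimated) shows how far each point
# falls from the fitted line, i.e., how accurate the estimate is.
for x, y in zip(years, income):
    print(x, y, round(y - (m * x + b), 2))
```

The residuals printed at the end are exactly the "differences between actual values and an estimated value" that the scatter diagram makes visible.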

Friday, March 30, 2007

Friday, March 9, 2007

Lab 7


This does in fact prove De Morgan's Law because, although it looks different, it is still doing the same thing for each output. It is as if we were taking away the parentheses and distributing the negation among the different parts of the equation. If you ran each one separately, without putting them on the same page, it would give you the same results.
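Here is a quick Python check (my own sketch, not part of the lab) that runs through every input combination and confirms that the two forms of each expression really do give the same output for every row.

```python
from itertools import product

for a, b in product([False, True], repeat=2):
    # not (a and b)  ==  (not a) or (not b)
    assert (not (a and b)) == ((not a) or (not b))
    # not (a or b)   ==  (not a) and (not b)
    assert (not (a or b)) == ((not a) and (not b))
    print(a, b, not (a and b), (not a) or (not b))

print("De Morgan's Law holds for every combination of inputs.")
```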

Lab 7


This screenshot shows the relationship between the inputs on the logic gate. If the inputs are the same, it yields a true result, and if they are different, the result is false.
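A tiny Python sketch of the gate behavior described above (output true when the two inputs match, which is what an XNOR, or equivalence, gate does); the function name is my own.

```python
def same_inputs(a: bool, b: bool) -> bool:
    """True when both inputs match, false when they differ."""
    return a == b

for a in (False, True):
    for b in (False, True):
        print(a, b, same_inputs(a, b))
```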

Friday, February 23, 2007

Lab 6 Post

I found that the binary number 1100101010 is equal to 810 in decimal. I did this just by following the lecture slides, using the binary arithmetic rules: simple rules based on the place value of each digit that are used to convert binary to decimal. For the second question, I found that the decimal number 529 is equal to 1000010001 in binary. I found this by repeatedly dividing by two and keeping track of the quotient and the remainder. This week I also learned a lot about positional and non-positional number systems. In a positional system, the value of each digit is determined by its position. In a non-positional system, the value is not affected by its position.
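Here is a short Python sketch (not from the lab, just my own illustration) of the two conversions, using the same place-value and quotient/remainder ideas described above.

```python
def binary_to_decimal(bits: str) -> int:
    """Add up each digit times its positional value (power of two)."""
    total = 0
    for digit in bits:
        total = total * 2 + int(digit)
    return total

def decimal_to_binary(n: int) -> str:
    """Repeatedly divide by 2, collecting the remainders."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        n, remainder = divmod(n, 2)
        digits.append(str(remainder))
    return "".join(reversed(digits))

print(binary_to_decimal("1100101010"))  # 810
print(decimal_to_binary(529))           # 1000010001
```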