Catalog Search Results
Author
Series
Description
This interdisciplinary resource on information management covers how information is stored, transferred, organized, accessed, interpreted, distributed, and used. It draws on computer science, library science, artificial intelligence, engineering, linguistics, psychology, the mathematics of programming, and the theory of problem solving. Readers learn about documentation, cataloging and classification, and archives...
Author
Description
Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition, provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new sections, in addition to fully updated examples, tables, figures, and a revised appendix. Intended primarily for practitioners, this book does not require...
Author
Description
Information can be conceptualized in two fundamental yet contradictory ways: it appears in the world as both a physical and a cognitive phenomenon. The dilemma information specialists face is similar to that of physicists who must cope with light as both a wave and a particle. Unlike physics, however, information science has yet to develop a unified theory that unites the contradictory conceptions of its essential theoretical object. While there...
Series
Pub. Date
2017.
Description
"Exploring the diverse terrain that makes up library and information science (LIS), this collection features the work of scholars, practitioners, and others who draw from a variety of theoretical approaches to name, problematize, and ultimately fissure whiteness at work. Contributors not only provide critical accounts of the histories of whiteness -- particularly as they have shaped libraries and archives in higher education -- but also interrogate...
Author
Formats
Description
From the invention of scripts and alphabets to the long misunderstood "talking drums" of Africa, James Gleick tells the story of information technologies that changed the very nature of human consciousness. He also provides portraits of the key figures contributing to the inexorable development of our modern understanding of information, including Charles Babbage, Ada Byron, Samuel Morse, Alan Turing, and Claude Shannon.
Author
Pub. Date
2016.
Description
Examines how humanity records and passes on its culture to future generations, from the libraries of antiquity to the excess of information available in the digital age, and how ephemeral digital storage methods present a challenge for passing on current cultural memory to the future.
Author
Series
Description
A classic source for exploring the connections between information theory and physics, this text is geared toward upper-level undergraduates and graduate students. The author, a giant of 20th-century mathematics, applies the principles of information theory to a variety of issues, including Maxwell's demon, thermodynamics, and measurement problems. 1962 edition.
Author
Series
Great Courses volume 3
Description
How is information measured and how is it encoded most efficiently? Get acquainted with a subtle but powerful quantity that is vital to the science of information: entropy. Measuring information in terms of entropy sheds light on everything from password security to efficient binary codes to how to design a good guessing game.
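The entropy measure described above can be sketched in a few lines of code. This is an illustrative example, not part of the course materials; the function name and sample distributions are my own.

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each toss carries less information.
print(shannon_entropy([0.9, 0.1]))   # less than 1 bit
```

Lower entropy means a distribution is easier to guess, which is why, for example, passwords drawn from predictable patterns are weaker than their length suggests.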
Author
Series
Great Courses volume 1
Description
What is information? Explore the surprising answer of American mathematician Claude Shannon, who concluded that information is the ability to distinguish reliably among possible alternatives. Consider why this idea was so revolutionary, and see how it led to the concept of the bit, the basic unit of information.
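Shannon's "distinguish among alternatives" idea has a simple quantitative form: distinguishing among N equally likely alternatives requires log2(N) bits. A minimal sketch (the function name is my own, not from the lectures):

```python
import math

def bits_to_distinguish(n_alternatives):
    """Minimum bits needed to reliably distinguish n equally likely alternatives."""
    return math.log2(n_alternatives)

print(bits_to_distinguish(2))   # 1.0 -- one bit: heads or tails
print(bits_to_distinguish(8))   # 3.0 -- three yes/no questions suffice
```

This is why one bit resolves a coin flip, while identifying one card from a 52-card deck needs about 5.7 bits.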
Author
Series
Great Courses volume 21
Description
Enter the quantum realm to see how this revolutionary branch of physics is transforming the science of information. Begin with the double-slit experiment, which pinpoints the bizarre behavior that makes quantum information so different. Work your way toward a concept that seems positively magical: the quantum computer.
Author
Series
Great Courses volume 24
Description
Survey the phenomenon of information from pre-history to the projected far future, focusing on the special problem of anti-cryptography: designing an understandable message for future humans or alien civilizations. Close by revisiting Shannon's original definition of information and asking, "What does the theory of information leave out?"
Author
Series
Great Courses volume 16
Description
Return to the concept of entropy, tracing its origin to thermodynamics, the branch of science dealing with heat. Discover that here the laws of nature and information meet. Understand the influential second law of thermodynamics, and conduct a famous thought experiment called Maxwell's demon.
Author
Series
Great Courses volume 13
Description
Learn how DNA and RNA serve as the digital medium for genetic information. Also see how shared features of different life forms allow us to trace our origins back to an organism known as LUCA, the last universal common ancestor, which lived 3.5 to 4 billion years ago.
Science of Information: From Language to Black Holes: Turing Machines and Algorithmic Information
Author
Series
Great Courses volume 19
Description
Contrast Shannon's code- and communication-based approach to information with a new, algorithmic way of thinking about the problem in terms of descriptions and computations. See how this idea relates to Alan Turing's theoretical universal computing machine, which underlies the operation of all digital computers.
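Algorithmic (Kolmogorov) complexity, the length of the shortest program that produces a string, is not computable in general, but a general-purpose compressor gives a rough, practical proxy for "length of the shortest description". A sketch, assuming that intuition (the variable names and data are my own):

```python
import os
import zlib

# Compressed length as a computable stand-in for description length.
repetitive = b"ab" * 500        # 1000 bytes with an obvious short description
random_ish = os.urandom(1000)   # 1000 bytes with (almost surely) no short description

print(len(zlib.compress(repetitive)))  # tiny: the pattern is easy to describe
print(len(zlib.compress(random_ish)))  # near 1000: no shorter description found
```

The repetitive string compresses to a handful of bytes, while the random bytes barely compress at all, mirroring the algorithmic view that a string's information content is the length of its shortest description.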
Author
Description
The theory of information emerged in the 1940s, at the end of World War II. It was initiated by Claude E. Shannon in an article published in the Bell System Technical Journal in 1948, entitled A Mathematical Theory of Communication. At that time, the aim was to use communication channels more efficiently: to measure a channel's capacity and to transmit messages through it optimally.
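A standard textbook result in this tradition is the capacity of the binary symmetric channel, which flips each transmitted bit with probability p: its capacity is 1 - H(p), where H is the binary entropy function. A minimal sketch (the function names are my own; the formula is the standard one):

```python
import math

def binary_entropy(p):
    """H(p), the entropy in bits of a coin with bias p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity in bits per channel use of a binary symmetric channel
    with crossover (bit-flip) probability p."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0 -- noiseless channel: one full bit per use
print(bsc_capacity(0.5))   # 0.0 -- pure noise: nothing gets through
```

Capacity falls smoothly between these extremes; Shannon's theorem says reliable communication is possible at any rate below it.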
Author
Series
Great Courses volume 23
Description
Physicist John A. Wheeler's phrase "It from bit" makes a profound point about the connection between reality and information. Follow this idea into a black hole to investigate the status of information in a place of unlimited density. Also explore the information content of the entire universe!
Author
Description
"Have you watched videos online today? Did you post photographs on social media? Did you upload your English essay to Google docs? All of these are questions about data! ...Explore the definition of data and learn how essential it is to our everyday lives. ...Learn about the history of data, the transition from paper to computers, and the role that search engines such as Google play in handling data. By making connections between the relationships...