Prof. Anjan Kr Dasgupta (adbioc@caluniv.ac.in)
What Einstein’s \(E=mc^2\) is to relativity theory, Heisenberg’s uncertainty principle is to quantum mechanics—not just a profound insight, but also an iconic formula that even non-physicists recognize. The principle holds that we cannot know the present state of the world in full detail, let alone predict the future with absolute precision.
In his 1933 lecture "Light and Life", Niels Bohr applied an analogous uncertainty concept to biology, arguing that a living being would be killed by a sufficiently detailed physical investigation, so that there is a "complementarity" between the existence of life and the possibility of describing it scientifically.
There are reports claiming that biopsies do not spread cancer cells, but at least one 2004 study found a correlation between "fine-needle aspiration and an increase in the incidence of sentinel node metastases." While the two studies cannot be compared directly (one involves pancreatic cancer, the other breast cancer), the methods and analyses used in both are similar, yet the researchers reach conflicting conclusions. http://www.medicaldaily.com/cancer-biopsy-wont-spread-tumors-data-may-be-unconvincing-317136
# The 1927 uncertainty formulation

Heisenberg arrived at his formulation in 1927 via his famous thought experiment, in which he imagined measuring the position of an electron with a gamma-ray microscope. The formula he derived was \[ \epsilon(q)\eta(p) \ge \frac{h}{4\pi} \]
Heisenberg offered no direct proof for this version of his principle, and expressed his ideas “only informally and intuitively”, says physicist Jos Uffink of the University of Minnesota in Minneapolis.
The relation can be formulated more generally as \(\epsilon(A)\,\eta(B) \ge \tfrac{1}{2}\left|\langle[A,B]\rangle\right|\), where \(\epsilon(A)\) is the measurement precision of an observable A and \(\eta(B)\) is the disturbance that this measurement induces on another observable B.
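For position and momentum the commutator is \([q,p]=i\hbar\), so this general form recovers the 1927 bound quoted above: \[ \epsilon(q)\,\eta(p) \ge \frac{1}{2}\left|\langle[q,p]\rangle\right| = \frac{\hbar}{2} = \frac{h}{4\pi} \]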
The interesting implication stems from states for which the error and the disturbance can both be zero, \(\epsilon(A)=\eta(B)=0\), while the estimated lower bound on their product is non-zero. This motivates a state-independent approach in which the error-disturbance trade-off is evaluated over a set of states that "calibrate" the measurement apparatus.
Encapsulating the strangeness of quantum mechanics is a single mathematical expression. According to every undergraduate physics textbook, the uncertainty principle states that it is impossible to simultaneously know the exact position and momentum of a subatomic particle — the more precisely one knows the particle’s position at a given moment, the less precisely one can know the value of its momentum.
Werner Heisenberg showed that as things get small, you cannot know everything precisely. Well, it turns out, as things get big, we can get confused, too.
Under scale, we lose precision. Big is hard! Time, meaning, mutual understanding, dependencies, staleness, and derivation all become a challenge. Heisenberg pointed out that at a small scale, uncertainty is a fact of life. In computing at a large scale, uncertainty is also a fact of life. http://queue.acm.org/detail.cfm?id=1988603
Simultaneity does not exist at a distance. The speed at which information travels is bounded by the speed of light (the EPR paradox notwithstanding). By the time we see a distant object in the night sky, it may have changed. Similarly, by the time we receive a message from a distant computer, the data contained in that system may have changed. By the time we hear a piece of bad news, something good may already have happened.
The Google Trends line can be analyzed on the R platform using the gtrendsR package. We use the package to compare the key words "Bigdata", "Genomics", "Bioinformatics" and "Systems Biology", as sketched below.
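A minimal sketch of that comparison, assuming the gtrendsR package is installed and Google Trends is reachable (the five-year time window is an assumption):

```r
library(gtrendsR)

# Compare worldwide search interest for the four key words over the last five years
trend <- gtrends(keyword = c("Bigdata", "Genomics", "Bioinformatics", "Systems Biology"),
                 time = "today+5-y")

# One row per time point per key word
nrow(trend$interest_over_time)

# The built-in plot method draws the comparative trend lines
plot(trend)
```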
Big data implies enormous volumes of data. Data used to be created by employees; now it is generated by machines, networks and human interaction on systems like social media, so the volume of data to be analyzed is massive. Yet Inderpal states that the volume of data is not as much of a problem as other V's such as veracity.
Variety refers to the many sources and types of data, both structured and unstructured. We used to store data from sources like spreadsheets and databases. Now data comes in the form of emails, photos, videos, monitoring devices, PDFs, audio, and so on. This variety of unstructured data creates problems for storing, mining and analyzing data. Jeff Veis, VP Solutions at HP Autonomy, presented how HP is helping organizations deal with big data challenges, including data variety.
Big Data Velocity deals with the pace at which data flows in from sources such as business processes, machines, networks and human interaction with things like social media sites and mobile devices. The flow of data is massive and continuous. This real-time data can help researchers and businesses make valuable decisions that provide strategic competitive advantage and ROI, if you are able to handle the velocity. Inderpal suggests that sampling data can help deal with issues like volume and velocity; a toy illustration follows.
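As a minimal sketch of the sampling idea (my illustration, not a method Inderpal describes), a reservoir sampler keeps a fixed-size uniform sample of a stream without ever storing the whole stream:

```r
# Reservoir sampling (Algorithm R): after seeing i items, each one sits in the
# reservoir with probability k/i, so the final sample is uniform over the stream.
reservoir_sample <- function(stream, k) {
  if (length(stream) <= k) return(stream)
  reservoir <- stream[1:k]
  for (i in (k + 1):length(stream)) {
    j <- sample.int(i, 1)
    if (j <= k) reservoir[j] <- stream[i]   # replace a random slot
  }
  reservoir
}

# Example: a uniform sample of 10 readings from a "stream" of one million
set.seed(1)
reservoir_sample(rnorm(1e6), 10)
```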
Big Data Veracity refers to the biases, noise and abnormality in data. Is the data that is being stored and mined meaningful to the problem being analyzed?
Closely related to veracity is the issue of validity: is the data correct and accurate for its intended use? Clearly, valid data is key to making the right decisions. Phil Francisco, VP of Product Management at IBM, spoke about IBM's big data strategy and the tools they offer to help with data veracity and validity.
Big data volatility refers to how long data remains valid and how long it should be stored. In this world of real-time data, you need to determine at what point data is no longer relevant to the current analysis.
https://www.scientificamerican.com/article/heisenbergs-uncertainty-principle-is-not-dead/ https://www.treasury-management.com/article/4/347/2891/heisenbergs-uncertainty-principle-and-unintended-consequences-in-finance.html https://mathoverflow.net/questions/130107/is-there-an-equivalent-of-heisenbergs-uncertainty-principle-in-the-decision-sci
In the early 1900s, hobos living on the rails would frequently eat "Mulligan stew." Members of the community would contribute whatever ingredients were available, and everything would be tossed into a community pot. Based on circumstances, there could be many different meats, fishes, vegetables, and other edibles all combined into the same stew. Leftovers would form the basis of tomorrow's stew. While this dish can be delicious, it's difficult to define its contents precisely. Many large systems extract data from a tremendous number of sources. This disparate data is processed, crunched, shoehorned, and pounded into a mush, a Mulligan stew of sorts. By combining data and reasoning about its relationship to other data, astonishing businesses such as Google and Amazon have sprouted. This section examines some of the mechanisms by which this is accomplished and the compromises needed in the process.
Each transaction shown in the figure seems to execute in a crisp and clear "now." Some stuff happened in the past; other stuff happens in the future; the transaction sees "now." The definition of "now" is the boundary within which a transaction can be applied. Simultaneity does not exist at a distance. Knowledge travels at the speed of light. By the time you see a distant object in the night sky, it may have changed. Similarly, by the time you receive a message from a distant computer, the data contained in that system may have changed.
All data on the Internet is from the “past.” By the time you see it, the truthful state of any changing values may be different. Each independent database, system, or application will cultivate its own data, which will change over time. In loosely coupled systems, each system has a “now” inside and a “past” arriving in messages. By the time you’ve seen some unlocked message or document, the truth may have changed. If you depend on a set of data being cohesive, it must be written by its source as a single identity within a single atomic transaction. This cohesive data unit must have a unique identity and version. In today’s large, loosely coupled systems, there will be many of these sets of data. Within each, the temporal nature can be cohesive. Across them, the application must reason about their different histories.
The paradox of the census is the following. By the time we have census data (often used by policy makers to drive their agenda), what if someone has died after being counted? What if someone is born after the family's house is counted but before the census is complete? If the census is to be more accurate, it will take more time to complete, and the inaccuracy due to births and deaths during the count will rise. If, on the other hand, the census is done in a hurry, there will be a high measurement error, though the error due to new births and deaths will be smaller.
Single-mode fibres with low loss and a large transmission bandwidth are a key enabler of long-haul high-speed optical communication and form the backbone of our information-driven society. One recent demonstration used spatial multiplexing to reach a data rate of 5.1 Tbit s\(^{-1}\) per carrier on a single wavelength over a single fibre.
Given a communication channel with a bandwidth of B Hz and a signal-to-noise ratio of S/N, where S is the signal power and N is the noise power, Shannon's formula for the maximum channel capacity C of such a channel is \[ C = B \cdot \log_2 (1 + S/N) \] For example, for a channel with a bandwidth of 3 kHz and an S/N value of 1000, like that of a typical telephone line, the maximum channel capacity is
\[ C = 3000 \cdot \log_2 (1 + 1000) \approx 30{,}000 \ \text{bps} \]
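A quick numerical check of this example in R (base R only):

```r
# Shannon capacity of a 3 kHz telephone channel with S/N = 1000 (linear, not dB)
B  <- 3000                 # bandwidth in Hz
SN <- 1000                 # signal-to-noise ratio
C  <- B * log2(1 + SN)
C                          # ~29,900 bit/s, i.e. roughly 30 kbit/s
```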
https://journal.austms.org.au/ojs/index.php/ANZIAMJ/article/viewFile/9414/1871 https://www.revolvy.com/main/index.php?s=Shannon%20Limit&item_type=topic http://ieeexplore.ieee.org/document/7148078/
Fluctuation in the number of bits transferred is complementary to fluctuation in the S/N ratio.
* Increasing the number of signal levels increases the probability of an error occurring; in other words, it reduces the reliability of the system. Why?
We can express the Shannon limit as an equivalent frequency \(\nu\): \[ C_{max} = \nu \]
We can now write \[ \Delta \left(B \cdot \log_2 (1 + S/N)\right) = \Delta \nu = \Delta E/h \]

# The other way of expressing uncertainty

\[ \Delta E \, \Delta t \ge \hbar \]

# Making Shannon meet Heisenberg

What follows is \[ h \, \Delta \left(B \cdot \log_2 (1 + S/N)\right) \cdot \Delta t \ge \hbar \] or, \[ \Delta \left(B \cdot \log_2 (1 + S/N)\right) \cdot \Delta t \ge \frac{1}{2\pi} \]

# Big Data - a small implication

For a constant bandwidth B we can take \[ n = B \, \Delta t \] where n is the number of bits transferred.
What follows is \[ n \cdot \log_2(1+S/N) \ge \frac{1}{2\pi} \] For big data we can assume \[ n \rightarrow \infty \] \[ S/N \rightarrow 0 \]

# Big or better

Like the thermodynamic limitation, we can have an information limitation on data size: since the product \(n \cdot \log_2(1+S/N)\) need only stay above a fixed bound, an ever larger n can be met by an ever smaller S/N, so if we increase the number of accumulated bits in an unregulated way, at the end there will be no signal.
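A small numerical sketch of this trade-off (my illustration of the inequality above): if the product \(n \cdot \log_2(1+S/N)\) is pinned at the bound \(1/2\pi\), the admissible S/N collapses as n grows.

```r
# Saturate n * log2(1 + S/N) = 1/(2*pi) and solve for S/N at each n
n  <- 10^(1:8)                      # number of transferred bits
sn <- 2^(1 / (2 * pi * n)) - 1      # S/N that exactly meets the bound
data.frame(n = n, SN = signif(sn, 3))
```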
We propose three thought experiments to show how "I" could be determined with absolute certainty in a living cell, assuming that, after determination of the genome sequence, the original cell is further available for tracing its behaviour, simulating or verifying predictions about its genotype/phenotype relationships, or obtaining derivative cells or organisms.
The measure of uncertainty (U) about the actual sequence in a living cell can be quantified by \[ U \ge \mu \cdot s \] where \(\mu\) is the mutation rate of the cell type under consideration and \(s\) is the size of the sequence being determined.
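To put rough numbers on this (both values below are assumptions for illustration, not figures from the cited work):

```r
# Illustrative only: an assumed per-base mutation rate and a genome-scale sequence length
mu <- 1e-9        # assumed mutations per base per cell division
s  <- 3e9         # assumed sequence length in bases (order of a human genome)
U  <- mu * s      # lower bound on the sequence uncertainty per division
U                 # ~3 bases, so the "known" sequence drifts as soon as the cell divides
```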
https://tbiomed.biomedcentral.com/articles/10.1186/1742-4682-2-40 https://luysii.wordpress.com/tag/deep-sequencing/ http://theconversation.com/explainer-heisenbergs-uncertainty-principle-7512 http://www.utu.fi/fi/yksikot/sci/yksikot/fysiikka/tutkimus/top5/Documents/Cowen_Nature%20Heisenberg.pdf
The S/N ratio goes down as the volume of information explodes.
# The uncertainty in conformation

We report here that IDPs (intrinsically disordered proteins), which have the maximum fluctuations in \(\phi\) and \(\psi\), have the least fluctuation in \(\omega\). The complementarity between quantum fluctuations in the resonant bond angle and the angles fixed by relatively mesoscopic interactions, such as hydrophobic forces, may be important to note.
\[ \Delta \text{Creativity} \cdot \Delta \text{Reproducibility} \ge 1 \] PS: Physicists hate the loss of possession of the quantum principles; I hope I am not hurting this community.