Post List

Computer Science / Engineering posts


  • January 1, 2012
  • 09:41 AM
  • 808 views

Copyright vs Medicine: If this topic isn’t covered in your newspaper this weekend, get a new newspaper

by Neurobonkers in Neurobonkers

According to the New England Journal of Medicine, after thirty years of silence, the authors of a standard clinical psychiatric bedside test have issued takedown orders against new medical research.... Read more »

Newman, J., & Feldman, R. (2011) Copyright and Open Access at the Bedside. New England Journal of Medicine, 365(26), 2447-2449. DOI: 10.1056/NEJMp1110652  

  • December 21, 2011
  • 11:15 AM
  • 594 views

Invisibility Cloaks: Now you see it, now you don't.

by Char in Basal Science (BS) Clarified

Invisibility is something we often see on TV shows and in comic books. We’ve always thought it unattainable, at least in the foreseeable future. However, two groups of scientists have this past year demonstrated two promising methods of achieving “invisibility” using advanced materials.... Read more »

Gharghi M, Gladden C, Zentgraf T, Liu Y, Yin X, Valentine J, & Zhang X. (2011) A carpet cloak for visible light. Nano letters, 11(7), 2825-8. PMID: 21619019  

Zhang B, Luo Y, Liu X, & Barbastathis G. (2011) Macroscopic invisibility cloak for visible light. Physical review letters, 106(3), 33901. PMID: 21405275  

Chen X, Luo Y, Zhang J, Jiang K, Pendry JB, & Zhang S. (2011) Macroscopic invisibility cloaking of visible light. Nature communications, 176. PMID: 21285954  

  • December 20, 2011
  • 01:04 AM
  • 872 views

Who Benefits from Serious Gaming?

by Dr Shock in Dr Shock MD PhD

Multimodality, or using a combination of visual, auditory, haptic, and other sensory modalities in the presentation of knowledge in serious gaming, improves learning outcomes. Interactivity, or the communication between player and the digital gaming system, also improves learning outcomes. But these are two design elements and not psychological attributes of users of [...]... Read more »

Lee, Y., Heeter, C., Magerko, B., & Medler, B. (2011) Gaming Mindsets: Implicit Theories in Serious Game Learning. Cyberpsychology, Behavior, and Social Networking. DOI: 10.1089/cyber.2011.0328  

  • December 16, 2011
  • 10:00 AM
  • 4,177 views

Polygon rectangulation, part 3: Minimum-length rectangulation

by Aaron Sterling in Nanoexplanations

In this third (and final) post on polygon rectangulation, I will consider how to find the rectangulation of minimum total length for an orthogonal polygon.  In part one of this short series, we considered rectangulations with a minimum number of … Continue reading →... Read more »

Andrzej Lingas, Ron Y. Pinter, Ronald L. Rivest, & Adi Shamir. (1982) Minimum edge length partitioning of rectilinear polygons. Proceedings of the 20th Allerton Conference on Communication, 53-63.

  • December 12, 2011
  • 03:10 PM
  • 4,942 views

The First Quantum Computer

by The Astronomist in The Astronomist.

In a nondescript business park outside Vancouver, with views of snow-capped mountains in the distance, is a mirrored building where very special work is being done. The company is D-Wave, the quantum computing company. D-Wave's mission is to build a computer that will solve humanity's grandest challenges; it aims to develop the first quantum computer in the world, and perhaps it already has. The advent of quantum computers would be a sea change, allowing the breaking of cryptography, better artificial intelligence, and exponential increases in computing speed for certain applications. The idea of quantum computers has been bubbling since Richard Feynman first proposed that the best way to simulate quantum phenomena would be with quantum systems themselves, but it has been exceedingly difficult to engineer a computer that can harness the possibilities of quantum information processing. Hardly a decade ago, D-Wave began with a misstep that is the origin of its name. The company's first idea would have used yttrium barium copper oxide (YBCO), a charcoal-looking material that superconducts above the boiling point of liquid nitrogen, which makes it the standard science-lab demonstration of superconducting magnetic levitation. Ultimately YBCO's crystalline structure made it an imperfect material, but the cloverleaf d-wave atomic orbital that lends YBCO its superconducting properties stuck as D-Wave's name. The vision of D-Wave did not change, but its approach did: the company realized it would have to engineer and build the majority of the technology necessary to create a quantum computer itself. 
They even built their own superconducting electronics foundry to perform the electron-beam lithography and metallic thin-film evaporation processes necessary to create the qubit microchips at the heart of their machine. I recently got to visit D-Wave, the factory of quantum dreams, for myself. The business park D-Wave occupies is so nondescript that we drove right by it at first. I was expecting lasers and other blinking lights; instead, our rented University of Washington van pulled into the wrong parking lot, out of which we narrowly reversed. In the van were several other quantum aficionados, students, and professors, mostly from computer science, who were curious about what a quantum computer actually looks like. I am going to cut the suspense and tell you now that a quantum computer looks like a really big black refrigerator, or maybe a small room. The chip at the heart of the room is cooled to a few millikelvin, colder than interstellar space, and that is where superconducting circuits count electric quantum sheep. The tour began with us milling around a conference room while our guide, a young scientist and engineer, held in his hand a wafer bearing hundreds of quantum processors. I took a picture; after I left that conference room they did not let me take any more. Entering the laboratory, it suddenly dawned on me that this wasn't just a place for quantum dreams: it was real and observable, and the entire notion of a quantum computer became more tangible. A quantum computer is a machine that uses quantum properties like entanglement to perform computations on data. The biggest similarity between a quantum computer and a regular computer is that both perform algorithms to manipulate data. The data, or bits, of a quantum computer are known as qubits. A qubit is not limited to the values 0 or 1 as in a classical computer but can be in a superposition of these states simultaneously. 
Sometimes a quantum computer doesn't even give you the same answer to the exact same question. Weird. The best way to conceive of quantum computation may be to imagine a computation in which each possible output has a positive or negative probability amplitude (a strange quantum idea), and the amplitudes for wrong answers cancel to zero while those for right answers are reinforced. The power of quantum computers is nicely understood within the theoretical framework of computational complexity theory. Say, for example, that I give you the number 4.60941636 × 10^18 and ask for its prime factors. If someone were to give you the prime factors you could verify them as correct very quickly, but what if I asked you to generate the prime factors for me? (I dare you. I have the answer. I challenge you.) The quintessential problem here is the P versus NP question, which asks whether every problem whose solution can be verified quickly can also be solved quickly. Quickly is defined as polynomial time, meaning that the algorithm's running time scales as the input size raised to some power. Computational complexity theory attempts to categorize problems by how fast a solution can be found as the size of the problem grows. A P-class problem is one whose solution can be found in polynomial time; an NP-class problem is one whose solution can be verified in polynomial time. So if I ask you for the prime factors of my number above, that is an NP problem: given the factors you could verify the answer quickly, but it would be very difficult to find them given only the number. Every P problem is trivially also in NP; the open question is whether the containment is proper, that is, whether some problems verifiable in polynomial time cannot be solved in polynomial time. 
The issue is that for some very interesting real-world problems we could verify an answer if we stumbled upon it, but with current computers and algorithms we would not be able to stumble upon the answer in a time shorter than the age of the universe. What we know and what we think we know swim in a sea of confusion, but the popular opinion, and where people would place their wagers, is that P is not equal to NP. Suddenly, with mystique and spooky action at a distance, quantum computing comes swooping in and claims to be able to solve all P problems and some NP problems very quickly. A general quantum computer would belong to the complexity class BQP. There is a grand question at hand: is BQP in NP? (More generally, is BQP contained anywhere in the polynomial hierarchy? The polynomial hierarchy is a family of complexity classes that generalizes P and NP using idealized abstract computers able to solve certain decision problems in a single step; see the paper on BQP and the polynomial hierarchy by Scott Aaronson, an outspoken critic of D-Wave.) At this time we cannot even claim to have evidence that BQP is not part of NP, but most scientists close to the problem think BQP is not a subset of NP. Quantum computing researchers are trying to find better evidence that quantum computers cannot solve NP-complete problems in polynomial time (if NP were a subset of BQP, the polynomial hierarchy would collapse). A reasonable wager I would take is that P is a (proper) subset of BQP and that BQP is itself a (proper) subset of NP. This claim has not been rigorously proved, but it is widely suspected, and for some NP problems, such as prime factorization and certain combinatoric problems, quantum speedups have indeed been shown. There might be an elephant in the room here. 
The D-Wave architecture is almost certainly attacking an NP-complete problem, and reasonable logic says that quantum computers will solve P problems and some NP problems, but not NP-complete problems (this too is not proven, merely suspected). An NP-complete problem is one for which the time to compute an answer may reach into millions or billions of years even for moderately large instances. Thus we don't know whether the particular quantum computer D-Wave has built lets us do anything efficiently that we couldn't already do efficiently on a classical computer; it does not seem to be a BQP-class machine, so it cannot, for example, break prime-factorization cryptography. So yes, it is a quantum machine, but we have no evidence it is an interesting machine. At the same time, we have no evidence it is an uninteresting machine: it is not general-purpose enough to be clearly a big deal, nor so trivial as to be totally uninteresting. The D-Wave lab was bigger than I expected, and it was at once more cluttered and more precise than I thought it would be. It turns out the entire enterprise of quantum computing follows this trend. There are a lot of factors to contend with, and on the tour I saw people dead focused, eyes on a microscope, executing precise wiring; coders working in pairs; theoreticians gesturing at a chaotic whiteboard; and even automated processes carried on by computers with appropriately futuristic-looking displays. The engineering problems D-Wave faces include circuit design, fabrication, cryogenics, magnetic shielding, and so on. There is too much to discuss here, so I will focus on what I think are scientifically the two most interesting parts of the D-Wave quantum computer: the qubit physics and the quantum algorithm they implement; in fact these two par... Read more »

Harris, R., Johansson, J., Berkley, A., Johnson, M., Lanting, T., Han, S., Bunyk, P., Ladizinsky, E., Oh, T., Perminov, I.... (2010) Experimental demonstration of a robust and scalable flux qubit. Physical Review B, 81(13). DOI: 10.1103/PhysRevB.81.134510  

Harris, R., Johnson, M., Han, S., Berkley, A., Johansson, J., Bunyk, P., Ladizinsky, E., Govorkov, S., Thom, M., Uchaikin, S.... (2008) Probing Noise in Flux Qubits via Macroscopic Resonant Tunneling. Physical Review Letters, 101(11). DOI: 10.1103/PhysRevLett.101.117003  
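The verify-easy/find-hard asymmetry the excerpt dares the reader with can be made concrete in a few lines. This is only a sketch: trial division is hopeless for a hard ~60-bit semiprime like the number in the post (which is left alone here), and that hopelessness is exactly the point.

```python
def trial_factor(n):
    """Prime factorization by naive trial division.

    Fast only when all prime factors are small; for a hard semiprime
    the loop runs on the order of sqrt(n) steps -- the "finding is
    hard" half of the asymmetry the excerpt describes.
    """
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

def verify(n, factors):
    """Verifying a claimed factorization is just one multiplication."""
    prod = 1
    for f in factors:
        prod *= f
    return prod == n

print(trial_factor(1234567890))                      # [2, 3, 3, 5, 3607, 3803]
print(verify(1234567890, [2, 3, 3, 5, 3607, 3803]))  # True
```

Verification runs in time linear in the number of factors, while the search loop does not; that gap is the informal content of "NP problems are easy to check but seemingly hard to solve."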

  • December 12, 2011
  • 12:16 PM
  • 4,330 views

Lessons learned from the abalone shell

by Cath in Basal Science (BS) Clarified

Sometimes the answers are right in front of your eyes. In a quest to develop lightweight structural materials with excellent strength and toughness (damage resistance), researchers have turned to the abalone shell for inspiration.
... Read more »

Munch E, Launey ME, Alsem DH, Saiz E, Tomsia AP, & Ritchie RO. (2008) Tough, bio-inspired hybrid materials. Science (New York, N.Y.), 322(5907), 1516-20. PMID: 19056979  

Ritchie RO. (2011) The conflicts between strength and toughness. Nature materials, 10(11), 817-22. PMID: 22020005  

  • December 10, 2011
  • 01:06 PM
  • 485 views

The next big thing? Automated methods in biology, or "Hooked on phenomics"

by zacharoo in Lawn Chair Anthropology

"This is very beautiful. It is neat, it is modern technology, and it is fast. I am just wondering very seriously about the biological validity of what we are doing with this machine." - Melvin Moss, 1967* "This machine" to which Moss referred nearly 50 years ago was not a contraption to clone a Neandertal or a Godzilla-like MechaGodzilla, but a computer. Along these lines, a paper came out recently describing a new, automated method for analyzing (biological) shapes, and while I think the method is pretty sweet, I think future researchers employing it should keep Moss's monition in mind. Doug Boyer and colleagues (2011) present "Algorithms to automatically quantify the geometric similarity of anatomical surfaces." It seems the main goals of the study were to make shape analysis [1] faster and [2] easier for people who don't otherwise study anatomy (such as geneticists), making it possible [3] to amass large phenotypic datasets comparable to the troves of genetic data accumulated in recent years. Using some intense math that's way over my head, the computer algorithm takes surface data (acquired through CT or laser scans) of a pair of specimens and automatically fits these forms with a "correspondence map" linking geometrically (and not necessarily biologically) homologous features between the two. It then uses the map to fit landmarks (a la geometric morphometrics), which are used to calculate a shape-difference metric between the individuals in each pairing. See at the right just how pretty it is! The authors posit that this technique could be used with genetic knock-out studies to assess how certain genes affect the development of bones and teeth, or to model the development of organs. That certainly would be useful in biomedical and evo-devo research. But while I appreciate the automated-ness of the procedure, I don't think we can simply write off the role of the biologist in determining what features are homologous, in favor of a computer. 
The paper itself illustrates this nicely. The authors state that there is debate about the origins of a cusp on the molar tooth of the sportive lemur (Lepilemur): is it the same as the entoconid of the living mouse lemur, or the enlarged metaconid of the extinct "koala lemur"? Their automated algorithm can map the sportive lemur's mystery cusp to match either alternative scenario. It is the external paleontological and phylogenetic evidence, not the intrinsic shape information, that renders one scenario more plausible. So let me reiterate that I think this paper presents an important step for the study of the biology of form, or the form of biology. Automating the analysis of form will certainly expedite studies of large datasets (not to mention freeing up the time of hordes of research assistants). But I hope that researchers employing this procedure will have a little Mossian Angel (a poor play on "guardian angel," sorry) on their shoulders, reminding them that the algorithm won't necessarily show them homology better than their own experience. And I hope all biologists have this Mossian Angel there, reminding them that even though this method is "neat ... modern technology, and ... fast," it may not be the most appropriate method for their research question. *This quote comes from a discussion at the end of a symposium: Cranio-Facial Growth in Man (1967). RE Moyers and WM Krogman, editors. New York: Pergamon Press.... Read more »

Boyer, D., Lipman, Y., St. Clair, E., Puente, J., Patel, B., Funkhouser, T., Jernvall, J., & Daubechies, I. (2011) Algorithms to automatically quantify the geometric similarity of anatomical surfaces. Proceedings of the National Academy of Sciences, 108(45), 18221-18226. DOI: 10.1073/pnas.1112822108  
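As a toy illustration of the final step described above, a landmark-based shape difference can be as simple as a root-mean-square distance over corresponding landmarks. To be clear about what is assumed: this is not the Boyer et al. algorithm, whose hard part is finding the correspondence map and alignment automatically; here both are taken as given.

```python
import math

def landmark_distance(a, b):
    """Root-mean-square distance between corresponding 2-D landmarks.

    Toy stand-in for a geometric-morphometrics shape-difference
    metric; assumes the two landmark lists are already matched up
    and aligned (the part the paper's pipeline automates).
    """
    assert len(a) == len(b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2
             for (ax, ay), (bx, by) in zip(a, b))
    return math.sqrt(sq / len(a))

# Two "specimens", identical except the second landmark is displaced by 1 unit.
print(landmark_distance([(0, 0), (1, 0)], [(0, 0), (1, 1)]))  # ≈ 0.707
```

The interesting scientific question, as the post argues, is not this arithmetic but whether the landmarks being compared are biologically homologous in the first place.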

  • December 9, 2011
  • 11:25 AM
  • 636 views

Can Software help Health care?

by Aurametrix team in Health Technologies

Apps, apps, and more apps. Software is everything and everything runs on software. Almost every industry in the U.S. has been disrupted by software; the health care field is not one of them. Easily accessible consumer information makes everyone a little bit of a doctor, and emerging portable diagnostic devices will strengthen the transition. Are we up to it? Not yet. ... Read more »

Archer N, Fevrier-Thomas U, Lokker C, McKibbon KA, & Straus SE. (2011) Personal health records: a scoping review. Journal of the American Medical Informatics Association : JAMIA, 18(4), 515-22. PMID: 21672914  

  • December 9, 2011
  • 10:00 AM
  • 759 views

Polygon rectangulation, part 2: Minimum number of fat rectangles

by Aaron Sterling in Nanoexplanations

This post is the second in a series on polygon rectangulation. In my previous post, I discussed methods to decompose an orthogonal polygon into a minimum number of rectangles.  (See that post for definitions and motivation.)  In my next post, … Continue reading →... Read more »

J. O'Rourke, & G. Tewari. (2002) Partitioning orthogonal polygons into fat rectangles in polynomial time. Canadian Conference on Computational Geometry, 97-100. info:/

  • December 8, 2011
  • 06:13 PM
  • 730 views

Somatic Mutation Detection in Whole Genome Sequencing Data

by Daniel Koboldt in Massgenomics

A paper online at Bioinformatics describes our flagship algorithm for detecting somatic point mutations in whole-genome sequencing of tumor samples. This freely available software package, called SomaticSniper, performs a Bayesian comparison of the genotype likelihoods in tumor and normal samples at every [covered] position in the genome. Overview Documentation Install The study includes a detailed [...]... Read more »

Larson, DE., Harris, CC., Chen, K., Koboldt, DC., Abbott, TE., Dooling, DJ., Ley, TJ., Mardis, ER., Wilson, RK., & Ding, L. (2011) SomaticSniper: Identification of Somatic Point Mutations in Whole Genome Sequencing Data. Bioinformatics. DOI: 10.1093/bioinformatics/btr665

  • December 8, 2011
  • 08:00 AM
  • 773 views

Burn your tables

by Zen Faulkes in Better Posters

A new paper provides empirical evidence for something that many people, like Edward Tufte, have been saying for years: graphs and figures are better than tables. Cook and Teo took the results of statistical simulations and presented them to people in the form of graphs or tables. Everyone was able to answer questions about the data more quickly using a graph. Less experienced people (i.e., undergraduates compared to postgraduates) were able to make more accurate statements about the results when presented with a table rather than a graph. They note that many journals print tables that make matters even worse: tables often have far too many significant digits, and readers are often asked to make comparisons horizontally rather than vertically. If you are thinking of putting a table on your poster: burn it. Reference: Cook A, Teo S. 2011. The communicability of graphical alternatives to tabular displays of statistical simulation studies. PLoS ONE 6(11): e27974. DOI: 10.1371/journal.pone.0027974. Photo by cranky messiah on Flickr; used under a Creative Commons license.... Read more »

  • December 5, 2011
  • 04:16 PM
  • 720 views

Breaking out of the mold—new recyclable plastic that’s stable when heated

by Cath in Basal Science (BS) Clarified

What do you think of when you hear the word plastic? Flexible? Recyclable? Strong?

We typically associate material characteristics with their applications. Plastics used in safety helmets have very good impact resistance, while those used for plastic bottles are chemically resistant. Even with the wide range of plastics available, recyclable plastics are typically softer than their harder non-recyclable counterparts. Wouldn’t it be great if we could pick and choose the properties we want in a material; for example, a durable plastic that’s also recyclable?... Read more »

Montarnal D, Capelot M, Tournilhac F, & Leibler L. (2011) Silica-like malleable materials from permanent organic networks. Science (New York, N.Y.), 334(6058), 965-8. PMID: 22096195  

  • December 5, 2011
  • 11:47 AM
  • 699 views

Techie Ecstasy: Preview of SIGGRAPH Asia 2011

by Nsikan Akpan in That's Basic Science

Highlights from the forthcoming tech fest SIGGRAPH Asia 2011

1) An iPhone app that writes your autobiography from internet posts

2) A Cartography graphics tool for cardiologists

3) A Renaissance collage generator.

Plus a video of some great rendering software innovations.... Read more »

Zhu, B., Iwata, M., Haraguchi, R., Ashihara, T., Umetani, N., Igarashi, T., & Nakazawa, K. (2011) Sketch-based Dynamic Illustration of Fluid Systems. ACM Transactions on Graphics, 30(6), 1. DOI: 10.1145/2070781.2024168  

Huang, H., Zhang, L., & Zhang, H. (2011) Arcimboldo-like collage using internet images. ACM Transactions on Graphics, 30(6), 1. DOI: 10.1145/2070781.2024189  

  • December 5, 2011
  • 01:46 AM
  • 1,001 views

Students' use of social media

by Dr Shock in Dr Shock MD PhD

We have been trying to use Twitter during lectures, especially since the group is so large that 100 to 200 students can only follow the lecture from another lecture room on a monitor. It wasn’t a success; question time during and after the lecture was hardly used, and only 8-10 questions were proposed via Twitter. [...]... Read more »

Giordano C, & Giordano C. (2011) Health professions students' use of social media. Journal of allied health, 40(2), 78-81. PMID: 21695367  

  • December 4, 2011
  • 07:31 AM
  • 1,046 views

why computers can’t predict revolutions

by Greg Fish in weird things

Back in September, news outlets worldwide reported the results of a paper claiming that a supercomputer had a knack for predicting revolutions and key global events, able to pick up on the events of Tahrir Square in Cairo and even get a fix on Osama bin Laden’s location. After reviewing the paper in question, I [...]... Read more »

Leetaru, K. (2011) Culturomics 2.0: Forecasting large-scale human behavior using global news media tone in time and space. First Monday, 16(9). info:/

  • December 2, 2011
  • 04:13 PM
  • 645 views

Polygon rectangulation, part 1: Minimum number of rectangles

by Aaron Sterling in Nanoexplanations

Over the next few posts, I will consider problems of polygon rectangulation: given as input an orthogonal polygon (all interior angles are 90 or 270 degrees), decompose it into adjacent, nonoverlapping rectangles that fully cover it.  Different problems impose different conditions … Continue reading →... Read more »

W.T. Liou, J.J.M. Tan, & R.C.T. Lee. (1989) Minimum Partitioning Simple Rectilinear Polygons in O(n log log n) Time. Symposium on Computational Geometry, 344-353. info:/10.1145/73833.73871

FERRARI, L., SANKAR, P., & SKLANSKY, J. (1984) Minimal rectangular partitions of digitized blobs. Computer Vision, Graphics, and Image Processing, 28(1), 58-71. DOI: 10.1016/0734-189X(84)90139-7  
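To make the problem statement concrete, here is a deliberately naive baseline, not any of the minimum-rectangle algorithms the series or the citations above describe: represent the orthogonal polygon as a set of unit grid cells and merge horizontally adjacent cells into maximal strips, one rectangle per strip. This always yields a valid cover but can use far more rectangles than the optimum.

```python
def strip_rectangulation(cells):
    """Cover an orthogonal polygon, given as a set of unit grid cells
    (x, y), with adjacent nonoverlapping rectangles by greedily
    merging horizontally adjacent cells into maximal strips.

    Valid but generally non-optimal: a tall thin region gets one
    rectangle per row, where one rectangle would suffice.
    """
    remaining = set(cells)
    rects = []
    while remaining:
        x, y = min(remaining)               # lexicographically smallest cell
        x2 = x
        while (x2 + 1, y) in remaining:     # grow the strip rightward
            x2 += 1
        for cx in range(x, x2 + 1):
            remaining.remove((cx, y))
        rects.append((x, y, x2 + 1, y + 1))  # (xmin, ymin, xmax, ymax)
    return rects

# L-shaped orthogonal polygon: a 3-cell bottom row plus 2 cells above its left end.
L_shape = {(0, 0), (1, 0), (2, 0), (0, 1), (1, 1)}
print(strip_rectangulation(L_shape))  # [(0, 0, 3, 1), (0, 1, 2, 2)]
```

For this L-shape the greedy strips happen to match the optimum of two rectangles; the algorithms in the cited papers guarantee the minimum count for every input.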

  • December 1, 2011
  • 11:30 AM
  • 745 views

The privacy payoff

by David Bradley in Sciencetext

What aspects of online privacy do we worry about most? Seemingly, there are three major concerns. The first is that on the internet, data about us is collected and stored permanently, and we then lose control of it. Secondly, we are often poorly informed about how the data we provide is going to be used. Thirdly, handing over [...]
... Read more »

Irene Pollach, & Horst Treiblmaier. (2011) The influence of privacy concerns on perceptions of web personalisation. Int. J. Web Science, 1(1/2), 3-20. info:/

  • November 30, 2011
  • 09:10 AM
  • 690 views

Technology experts can detect if images are photoshopped or not

by United Academics in United Academics

Angelina Jolie wrinkle-free? Penelope Cruz a D-cup? Not anymore. Scientists have come up with a way to detect whether photographs of celebrities or models have been airbrushed. Now they hope it will be used to provide a universal “health warning” on magazine images.... Read more »

Kee E, & Farid H. (2011) A perceptual metric for photo retouching. Proceedings of the National Academy of Sciences of the United States of America. PMID: 22123980  

  • November 30, 2011
  • 08:25 AM
  • 961 views

Video Tip of the Week: Phosida, a post-translational modification database

by Trey in OpenHelix

Over 2 years ago I did a tip of the week on Phosida. Phosida is a database of phosphorylation, acetylation, and N-glycosylation data. Since the last tip, Phosida has undergone significant growth and some changes, including the addition of much more data (80,000 phosphorylation, acetylation and N-glycosylated sites from 9 different species) and tools (prediction and [...]... Read more »

Gnad, F., Gunawardena, J., & Mann, M. (2010) PHOSIDA 2011: the posttranslational modification database. Nucleic Acids Research, 39(Database). DOI: 10.1093/nar/gkq1159  



Research Blogging is powered by SMG Technology.

To learn more, visit seedmediagroup.com.