Apple makes great computers, no question about it, but they are expensive. One alternative is Samsung computers.

“The Samsung – Apple battle continues, with Samsung coming in second to Apple but holding a higher rank than others due mostly to superior PCs like the sleek, elegant, and powerful ArtPC PULSE,” says RESCUECOM’s David Milman. Read more »

Using the Internet safely may seem like a simple thing, but many have discovered that this is not always the case. Almost everyone knows about the dangers of malware, spyware, and viruses, but there is one Internet threat on the rise that can be extremely difficult for technical services to deal with: “Ransomware.” Read more »

Everyone knows what WiFi is, or at least what it does for them. More people use it now than at any point in history, and the number continues to grow. It has pervaded everyday life and brought us the world in a way never possible before the modern age of technology, letting us see and know more than we ever thought we could and communicate with people around the globe with whom we would never have had contact as recently as a few decades ago. What exactly is WiFi, though? Even many people who know the ins and outs of what it does would find this a difficult question to answer, as simple as it seems. The dictionary defines it as “a facility allowing computers, smartphones, or other devices to connect to the Internet or communicate with one another wirelessly within a particular area.” Most think of it as basically being the Internet, and almost everyone knows it at least as something that lets us connect to the Internet, but there are many nuances and specifics this definition cannot really clarify. Its purpose, certainly, is to connect us to the world of the Internet, anywhere and at any time, as long as a connection is possible. It is the specifics that confuse people: how WiFi actually works, the difference between it and the Internet, its history, how it may evolve in the future, and its various applications. If you are not a tech-savvy person, chances are you do not understand these things, but they are really not particularly difficult once you know the essentials. Read more »

The term “self-driving car” probably conjures up thoughts in most people’s minds of someone eating, reading, or even sleeping while their car drives them where they want to go. They may think mainly of Tesla and Uber, arguably the two largest companies most heavily involved in the development and application of autonomous, or self-driving, cars. Like many technologies, this one has roots much earlier than many would think. Unlike many technologies developed over a long stretch of time and only being perfected today, however, vehicular autonomy, or the ability of a vehicle to drive itself without human assistance, did not have the potential to exist at the time it was imagined. While the idea existed, the technology available simply did not allow for its application. This is particularly interesting given the normal course of technological progress, wherein an idea develops, a very crude prototype is created, and inventors continue to create more advanced prototypes until technology allows for a much more usable application or even perfection. Another thing making the case of autonomous vehicular technology even more interesting is the case of the airplane. Lawrence Sperry, the head of the Sperry Corporation, an aviator, and the son of then-famous inventor Elmer Sperry, invented autopilot for airplanes in 1912. This was not even a decade after the Wright brothers’ famous first successful flight at Kitty Hawk, North Carolina in 1903. Read more »

Like many innovations just now coming to fruition, smart glasses are a technology people first began dreaming of long ago. Wearable tech has always fascinated people, though until the modern age of technology it was impossible to pull off, and the idea of a wearable display has arguably been the most fascinating of all for some time. In fact, while Thelma McCollum’s name might not be one everyone knows or thinks of when they think back on great inventors, she holds patents for dozens of new technologies from the 1940s and beyond. One of these was a patent for a device she referred to as a “stereoscopic television apparatus.” This was in 1945, and it sadly became largely forgotten. It was not until some fifteen years later, in 1960, that anybody really began taking the idea of a wearable display seriously. While nothing came to real fulfillment, Morton Heilig fascinated people with the idea. Nobody at the time could believe it was even possible but, sure enough, Heilig applied for and received a patent for his own, less bulky version of a head-mounted display. He followed this up in 1962 with his patent for the “Sensorama Simulator,” a virtual reality simulator complete with handlebars, a binocular display, a vibrating seat, stereophonic speakers, a cold air blower, and a device close to the nose which would generate odors according to the action in the film. The intention of this device was to immerse a film viewer completely in every aspect of the film in a way we still cannot quite accomplish, but it was a bold effort to be sure. Obviously, the sheer bulk of such a device made its use impractical en masse, but he did realize his dream via an arcade-style machine which played a 3D film along with stereo sound, vibration, smells, and wind. With this machine, he achieved his dream of creating a completely immersive sensory environment, but it would be several more years before a truly wearable device would arrive. Read more »

Motion capture is not new to film, nor is it any longer a particularly difficult thing to do. Simply put, motion capture (mocap or mo-cap for short) is the process of recording the movement of objects or people. It has applications beyond filmmaking in many fields, including sports, government, military, medicine, and robotics. Its main use to date, however, has been in the enhancement of films, using computer technology either to add realism or to add elements it would not be safe, practical, or financially advantageous to film otherwise. While referred to as motion tracking when used for other purposes, in filmmaking and games this process more commonly refers to “match moving.” This is a cinematic technique allowing for the insertion of computer graphics into live-action footage, and it takes real skill to do so with the correct position, scale, orientation, and motion. At its most advanced, only high-budget films can achieve truly professional results, relying on professional cinematographers as well as a huge staff of animators. It can take a lot of money and a great deal of skill to get the cleanest, most realistic results and to insert computerized elements into a film so that they sit correctly relative to all of the objects and actors in the shot. This does not mean that others cannot use it effectively, though that has been the case in years past. Now, with advancements in tech bringing the necessary motion-capture software to consumers on a much larger scale, more and more amateur videographers can use it without needing the money or professional expertise formerly required. Not only that, but evolving tech makes it much easier and faster to use on a smaller scale. Read more »
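The tracking step at the heart of motion tracking and match moving can be sketched in miniature: given two frames, find where a small patch from the first frame reappears in the second. The toy below is only an illustration of the idea, not any production mocap pipeline; the function names, the tiny hand-built frames, and the brute-force sum-of-squared-differences search are all assumptions made for the sketch.

```python
# Toy motion-tracking sketch: locate a patch from frame 1 inside frame 2
# by brute-force sum-of-squared-differences (SSD) template matching.

def extract_patch(frame, top, left, size):
    """Copy a size x size patch out of a frame (a list of pixel rows)."""
    return [row[left:left + size] for row in frame[top:top + size]]

def ssd(patch_a, patch_b):
    """Sum of squared differences between two equal-sized patches."""
    return sum((a - b) ** 2
               for row_a, row_b in zip(patch_a, patch_b)
               for a, b in zip(row_a, row_b))

def track_patch(frame, patch):
    """Slide the patch over the frame; return the best-matching (top, left)."""
    size = len(patch)
    best_score, best_pos = None, None
    for top in range(len(frame) - size + 1):
        for left in range(len(frame[0]) - size + 1):
            score = ssd(extract_patch(frame, top, left, size), patch)
            if best_score is None or score < best_score:
                best_score, best_pos = score, (top, left)
    return best_pos

# A bright 2x2 "object" at (1, 1) in frame 1 moves to (2, 3) in frame 2.
frame1 = [[0] * 6 for _ in range(5)]
frame2 = [[0] * 6 for _ in range(5)]
for dr in (0, 1):
    for dc in (0, 1):
        frame1[1 + dr][1 + dc] = 9
        frame2[2 + dr][3 + dc] = 9

patch = extract_patch(frame1, 1, 1, 2)
print(track_patch(frame2, patch))  # -> (2, 3): where the object moved to
```

Real match-moving software tracks many such features per frame and solves for the camera's 3D position from them, which is where the skill and budget come in; the search itself is the same idea at much larger scale.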

Photography has existed for over 150 years, and it has been a hobby everyone could enjoy for around 100. The term “computational photography” (also called “computational imaging”), of course, is a much more modern term that goes far beyond what traditional photography has to offer. Its purpose is to overcome the limitations of traditional photography using technology. It utilizes the newest developments in photographic technology to improve everything from optics and sensors to composition, style, and more. Its intent is simply to improve the way photographers process, manipulate, and interact with arguably the most simple and basic form of visual media available to us today. While some still prefer the raw nature of traditionally taken photos, computational photography is a means everyone can take advantage of to improve the quality of the photos they take. This means it has applications for those who do not know the technical aspects of photography as well as for professionals, and especially in the professional realm. It provides obvious advantages in business, web design, presentation, the legal field, medicine, and much more. While as a hobby it may have its detractors, there is no denying the benefit it can provide when the clarity of what one can see in an image can make a very important determination. Read more »
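One of the simplest computational-photography tricks illustrates the idea of overcoming a sensor's limits with software: average a burst of noisy exposures of the same scene, and the random sensor noise cancels out. The sketch below is a minimal illustration of that stacking technique, assuming tiny hand-made grayscale "frames"; it is not any camera vendor's actual pipeline.

```python
# Computational-photography sketch: average a burst of noisy exposures
# of the same scene to suppress random sensor noise (image stacking).

def average_stack(frames):
    """Pixel-wise mean of equally sized grayscale frames (lists of rows)."""
    n = len(frames)
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[sum(f[r][c] for f in frames) / n for c in range(cols)]
            for r in range(rows)]

# Three noisy 2x2 captures of a flat gray scene whose true value is 100.
burst = [
    [[97, 103], [101, 99]],
    [[102, 98], [100, 104]],
    [[101, 99], [99, 97]],
]
print(average_stack(burst))  # -> [[100.0, 100.0], [100.0, 100.0]]
```

This is essentially what smartphone "night modes" do at far greater sophistication: capture many frames, align them, and combine them into one image cleaner than any single exposure the hardware could take.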

3D printing is something that most people would have considered impossible as recently as fifty years ago. In fact, the concept was not something anyone had even begun to put into motion until the early 1980s. As technology progresses, however, we have not only discovered the means for 3D printing but have continued to improve upon it. The question for some, though, is this: what exactly is 3D printing? Simply put, 3D printing is a means of creating a physical object from a digital design using a specialized machine. There are different 3D printing technologies and materials that we can now print with, but all of these work on the same principle: the program turns a digital model into a solid three-dimensional physical object by adding material layer by layer. If this seems amazing, consider that it is now becoming somewhat commonplace at schools, libraries, and similar institutions, while some companies have even created models for use in the home. Not only that, but the process has been further advanced with the creation of 3D pens, with dozens of models now available on the market. Before going into detail, though, there are some other more basic facts to consider about 3D printing and its many implications. Read more »
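The "layer by layer" principle can be sketched in a few lines: slicing software cuts the digital model into horizontal layers, and the printer deposits them bottom-up. The toy below (a minimal sketch; the voxel model and function name are made up for illustration, and real slicers work on triangle meshes, not voxels) groups a model's points by height into that print order.

```python
# 3D-printing sketch: "slice" a tiny voxel model into the layer-by-layer
# plan a printer would deposit, bottom layer first.

def slice_into_layers(voxels):
    """Group (x, y, z) voxels by z and return the layers in print order."""
    layers = {}
    for x, y, z in voxels:
        layers.setdefault(z, set()).add((x, y))
    return [sorted(layers[z]) for z in sorted(layers)]

# A 2x2 base with a single voxel "tower" on top of one corner.
model = {(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0), (0, 0, 1)}
for height, layer in enumerate(slice_into_layers(model)):
    print(f"layer {height}: {layer}")
# layer 0: [(0, 0), (0, 1), (1, 0), (1, 1)]
# layer 1: [(0, 0)]
```

Each printed layer fuses to the one below it, which is why the bottom-up ordering matters: every layer needs something underneath to adhere to.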

Companies have continually striven to combine better picture quality and portability with larger size in the design of TVs. Simply put, bigger is better, and making a TV lightweight so it is easy to carry and set up is certainly a bonus. It takes great leaps in TV technology to keep making TVs larger while not only maintaining but drastically improving the picture, all without making them impossible to move. Thus, it is something toward which every TV brand is continuously striving, and now an innovation some never thought possible, one that is, in fact, the stuff of sci-fi fantasy, is available on the consumer market. In 2016, LG debuted an 18-inch TV that was mere centimeters thin, the screen of which could be folded or rolled up. At the time, the display was good but not exceptional, but all this is new and improved in a greater achievement: the new LG Design 65-inch 4K OLED rollable TV. Read more »

Voice activation of computer technology, beginning with using human speech to get a computer to understand a function you want it to perform and developing from there, has been a dream for many years but a practical reality only much more recently. The potential applications are almost innumerable, and one cannot exaggerate the helpfulness of technology that you do not need to sit directly next to, hold in your hand, or even touch. IBM’s Shoebox, introduced at the 1962 Seattle World’s Fair, was able to recognize sixteen spoken words, including the digits 0 to 9. Now, nearly everyone carries a device with them daily that can perform dozens of tasks, if not more, via voice command. Read more »
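Shoebox demonstrated the core pattern that still underlies voice commands today: a recognizer turns audio into words, and a separate layer maps those words to actions (Shoebox performed simple arithmetic on spoken digits). The toy interpreter below sketches only that second layer; the recognition itself, by far the hard part, is assumed to have already happened, and the word list and grammar here are illustrative, not Shoebox's actual vocabulary.

```python
# Toy command interpreter in the spirit of IBM's Shoebox: assume a speech
# recognizer has already turned audio into words; map those words to math.
# The word set and "total" command are illustrative, not the real device's.

DIGITS = {"zero": 0, "one": 1, "two": 2, "three": 3, "four": 4,
          "five": 5, "six": 6, "seven": 7, "eight": 8, "nine": 9}
OPS = {"plus": lambda a, b: a + b,
       "minus": lambda a, b: a - b,
       "times": lambda a, b: a * b}

def run_command(words):
    """Evaluate e.g. ['seven', 'plus', 'three', 'total'] left to right."""
    tokens = iter(words)
    result = DIGITS[next(tokens)]
    for word in tokens:
        if word == "total":          # "print the answer" command
            return result
        result = OPS[word](result, DIGITS[next(tokens)])
    return result

print(run_command(["seven", "plus", "three", "total"]))  # -> 10
```

Modern assistants replace the dictionary lookup with statistical language understanding, but the split between recognition and command dispatch remains.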

“Buzzwords” are very popular in the tech industry, but these often vague and uninformative terms are not always clear to consumers. One of these terms is “the cloud,” one everyone knows but whose particulars fewer can explain. To put it simply, it refers to software and services that operate on the Internet rather than on your computer, which you can access in a browser like Microsoft Edge, Firefox, or Google Chrome, as well as in some dedicated mobile apps. When your data is in the cloud, or you work in the cloud, that data goes somewhere for storage. In the case of the cloud, it goes to numerous locations, from which a network of servers can find and deliver it when needed. With remote servers handling most computing and storage, you can do your work on one computer and access it from any other computer with an Internet connection at any time. Read more »
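The "numerous locations" idea is the key to why the cloud feels always-available: a write is copied to several servers, so a read can be served by whichever replica is reachable. The sketch below is a deliberately tiny in-process model of that replication pattern; the class, method names, and the `down` parameter are all invented for illustration and are not any real cloud provider's API.

```python
# Toy sketch of cloud storage: a write is replicated to several "servers,"
# and a read can be served by any replica that is still reachable.

class TinyCloud:
    def __init__(self, n_servers=3):
        # Each dict stands in for one storage server at one location.
        self.servers = [{} for _ in range(n_servers)]

    def put(self, key, value):
        for server in self.servers:      # replicate to every location
            server[key] = value

    def get(self, key, down=()):
        """Read from the first server whose index is not in 'down'."""
        for i, server in enumerate(self.servers):
            if i not in down and key in server:
                return server[key]
        raise KeyError(key)

cloud = TinyCloud()
cloud.put("report.docx", "quarterly numbers")
# Even with server 0 offline, the file is still retrievable elsewhere.
print(cloud.get("report.docx", down={0}))  # -> quarterly numbers
```

Real systems add consistency protocols, geographic placement, and failure detection on top, but this redundancy is what lets "any computer, any time" access work even when individual machines fail.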