A recent report states that thousands of Instagram users have fallen for a scam that specifically targets followers of major financial institutions.

This report, published by social media-specializing security company ZeroFox, disclosed that over two million public Instagram posts had been found pushing a money-flipping scam.

Money-flipping is a con that involves scammers convincing victims that if they give the scammers access to their funds, they will multiply the value of those funds in return for a share of the profits. Of course, once the scammers are given access to the funds, they simply steal the money and go on their way.

Instagram has been attempting to remove money-flipping scam accounts from its platform, but apparently every time it manages to close one account, another three pop up.

Banks often compensate affected customers, swallowing the cost of the successful frauds for now. According to ZeroFox, one major American bank has put together a team devoted to dealing with money-flipping Instagram accounts in response to having already lost over $1 million to the scam.

At this point, the heist has taken varying forms. In one, the scammer, attempting to demonstrate legitimacy to the victim, assures them that it doesn’t even matter whether their bank account is empty or carries a negative balance. In these specific cases, the scammer then uses the victim’s bank details to deposit a fraudulent cheque and withdraw the cash before the bank catches on.

How does such a clear ruse manage to dupe Instagram users in such large numbers? Apparently the scammers take great pains to give themselves some semblance of legitimacy, filling their profiles with images of fancy watches, piles of cash, and other alluring displays of wealth.

A reporter purposefully played the stooge in a conversation with such a scammer, recording that the scammer’s back story was:

“What I do is find people who has an active bank account and the account can be negative 0 and what happens is after that I’ll look into the computer and fine some extra cash that someone hasn’t claimed and I’ll transfer it into your account.”

The scammer offered to make the reporter $15,000, of which he would take $3,000 as his cut, and assured the reporter that the process would be “110% legit.”

Security company ZeroFox recommended in its report that institutions utilize machine learning technology in their efforts to combat these kinds of problems. That said, ZeroFox sells machine learning technology, and some have accused it of publishing the report as a roundabout advertisement for its own product.
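To make the idea concrete, here is a toy sketch of automated scam-post flagging. This is a simple keyword heuristic for illustration only; the phrases and threshold are invented, and a real system like the one ZeroFox sells would rely on trained machine-learning models rather than a hand-written list.

```python
# Toy sketch: flag posts whose captions contain common money-flipping phrases.
# The phrase list and threshold are illustrative assumptions, not a real product.

SCAM_PHRASES = [
    "flip your money", "money flip", "100% legit", "110% legit",
    "dm me to get paid", "turn $", "fast cash guaranteed",
]

def scam_score(caption: str) -> int:
    """Count how many known scam phrases appear in a post caption."""
    text = caption.lower()
    return sum(phrase in text for phrase in SCAM_PHRASES)

def is_suspicious(caption: str, threshold: int = 1) -> bool:
    """Flag a caption once it matches at least `threshold` scam phrases."""
    return scam_score(caption) >= threshold

posts = [
    "Turn $200 into $2000 today, 110% legit, DM me to get paid!",
    "Sunset at the beach with friends",
]
print([is_suspicious(p) for p in posts])  # → [True, False]
```

A keyword filter like this is trivially evaded, which is exactly why the report argues for machine learning: models can generalize beyond a fixed phrase list.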

ZeroFox went on to criticize the social media platform. John Seymour, one of its data scientists, said “It’s really easy to private message someone on Instagram… Someone can initiate a direct message, without having followed the original person.”

ZeroFox also pointed out that of the two million posts it found, over 80% came from accounts more than 45 days old, suggesting that Instagram has been fairly lax about the scam.

Facebook-owned Instagram, upon receiving the report, noted that the scams were fairly low-volume but conceded that it would look into the report’s claims and recommendations.

Virtual reality (VR) headsets have been around for nearly 60 years, and as one could expect, they were not very good then; they are becoming pretty awesome now, evolving and getting better every year. Although they have become increasingly capable, there has yet to be a commercially viable application for them, until possibly now. Until recently, the extent of their reach was the occasional arcade or a few short-lived home models; today, surprisingly enough, you are more likely to see them in military training or research lab settings.

The game is changing, thanks to graphics processing, motion detection, and display technologies, among other prominent advances. These have all gotten smaller, faster, and cheaper, as is to be expected with any tech development.

Now let’s take a look at Project Morpheus: its history, its development, where it’s been, and where it’s all going. The device has been in the pipeline for nearly three years, and it isn’t just Sony’s answer to the Oculus Rift; Sony was already developing home virtual-reality-theater headsets like the HMZ 3D viewer. As early as 2010, the company released the PlayStation Move motion controllers for the PS3, which were fairly groundbreaking at the time, enabling a high degree of sophisticated motion tracking and relaying of player data. And in case you were wondering where the name comes from, it’s not a reference to The Matrix, at least that’s what they claim; it’s a reference to Morpheus, the ancient Greek god of dreams. So there you go.

Today, the Project Morpheus prototype is a nice, clean black-and-white wearable VR headset. It has a 5-inch LCD display with 1920 × 1080 pixel resolution, which is seen through its “lenses.” It should be noted that the industry needs a better term than “lenses,” because with lenses you assume a degree of transparency; in reality it’s just a screen, a computer screen you look at, not through, so that terminology should be revisited.

Project Morpheus has been created to operate with the powerful PlayStation 4 for an immersive VR gaming experience: you put on the headset and become the story, immersed in the game, and ideally you are in it.

This poses some concerns for the gaming community: much of what is enjoyable, and what an entire culture of gaming is built around, is the notion that you are watching a screen, holding your controller, and manipulating a third-person character in that way. The Nintendo Wii, which attempted to make interactive gaming take off, saw limited success, but the difference is that it was mostly marketed to kids, whereas the Morpheus is targeting the hardcore gamer. It is naive to think there will be no growing pains, or even that it will take off at all. Although it’s a nice idea and will probably be the future of gaming someday, today we just don’t know.

Tech companies have already succeeded in making the internet mobile; what remains to be seen is if they can make it global. An Indian telecom regulating agency recently decreed that Facebook’s Free Basics service, which offered free internet access to a select few sites (including Facebook, obviously) was too much of a censoring and monopolizing threat to be allowed in the country. Facebook CEO Mark Zuckerberg, who has always cast the service in an altruistic as opposed to internet-colonialism light, was of course, displeased.

Facebook board member Marc Andreessen heard about the regulatory agency’s decision and tweeted the following:

“Another in a long line of economically suicidal decisions made by the Indian government against its own citizens,” he stated. “Denying world’s poorest free partial Internet connectivity when today they have none, for ideological reasons, strikes me as morally wrong.”

Indian entrepreneur Vivek Chachra claimed that believing that Facebook’s pro-Free-Basics claim that some internet was better than no internet was simply a “justification of Internet colonialism.”

“Anti-colonialism has been economically catastrophic for the Indian people for decades. Why stop now?” Andreessen argued back.

The comment was deemed politically incorrect and profoundly offensive by the masses, leading to its deletion and Zuckerberg’s public disavowal.

“I found the comments deeply upsetting, and they do not represent the way Facebook or I think at all,” Zuckerberg responded.

“[India] has been personally important to me and Facebook,” he continued, claiming that he was “inspired by the humanity, spirit and values of the people” when he visited the country himself.

“It solidified my understanding that, when all people have the power to share their experiences, the entire world will make progress… I’ve gained a deeper appreciation for the need to understand India’s history and culture [and] I look forward to strengthening my connection to the country.”

The situation has created a vast moral and political dilemma for Facebook, the Indian government, and the Indian people; one Facebook would likely have preferred to see cast in the less complicated light of internet-spreading altruism.

“[The dilemma] is kind of interesting in that it pits Zuckerberg, who is truly a leader in Facebook, against Andreessen, who trades on the perception of similar success but has actually been more of a secret failure,” commented Rob Enderle, principal analyst of the Enderle Group.

Zuckerberg has withdrawn Free Basics from India, as mandated by the telecom regulatory committee. However, he still aims to bring the internet to communities across the globe and across all wealth brackets by 2020. This controversy throws that aspiration into question, not only in terms of whether it is feasible, but in terms of whether it is actually altruistic.

Still, some are optimistic. Mike Jude, program manager of Stratecast/Frost & Sullivan claimed that Zuckerberg’s response “sounds sincere” and that if Facebook “takes steps to engage culturally with India and proves that it’s really trying to be sensitive to its new market, this could be good for them…especially if it leads to better behavior in other markets.”

Still, Andreessen’s words have angered plenty of Twitter users, and they seem to confirm an ulterior, predatory motive lurking in the shadows of Facebook’s board rooms. Whether his perspective is shared and tolerated remains to be seen.

Since 2008, people have been buying more laptops than desktops. The trend of going mobile has gone mainstream, and if you’re still attached to an old desktop computer, you might want to wake up and smell the coffee. Here are some tips for any stragglers who are finally ready to make the transition from stationary to mobile computing.

When it comes to devices you can add to your mobile repertoire, there are a lot of different options available to you.

Let’s start with phones. If you have a landline, chances are you’re pushing 60. There’s really no need whatsoever for a landline anymore, and you oughta get a cellphone. And if you’re getting a cellphone, you should get a smartphone in particular, because then you can do so much more than talk and text: you can check your email, listen to music, read the news, look up the definitions of words you’ve never seen before, go online shopping, surf the dark web, check your Fitbit stats, and, if you’ve invested in the Internet of Things craze that everyone is so excited about, you can even tell your coffee maker to start brewing, order your garage door to open from the other side of the planet, lock or unlock your door, start or stop heating your house, feed your dog, or start TiVoing the old game-a-roo. With a smartphone, you’ve got a computer on your hands that really should be enough to keep you coasting, so long as you’re not trying to write a 20-page report on the go.

For that you might want a laptop, or a tablet with a keyboard attachment. At this point there are so many options in terms of size and processing power for laptop-mobile device hybrids that you’re likely to find exactly what you need for an affordable price. There’s something especially wonderful about Bluetooth keyboards, which can be made super light and portable and then connect to your device easily whenever you want, and from whatever distance works for you. It’s definitely worth checking out if you want something you can slip into your backpack and pull out whenever the need arises, plus a tablet to just hold and watch when you need it. If you think you’re going to be buckling down to do a lot of school work and typing with your device, plus video editing and gaming, you may want to opt for a laptop with higher processing power, but otherwise a tablet really does make a lot of sense.

The only catch? These devices need to be charged, and charging technology hasn’t really taken off like processing technology has. Most phones have to be charged nightly, and most laptops and other high-computing devices have to be charged after 5 or 6 hours of work, tops. That can put a serious dent in your mobile capabilities, but keep in mind that there are devices that store extra charge that you can then plug your electronics into, plus lots of trains and cross-country buses do have charging capacity.

You’ve likely heard of the mysterious acronyms surrounding TV screens and monitors; LCD and CRT screens have been battling it out (with LCD generally maintaining the upper hand) for the better part of the last decade. Here’s an overview of what these two monitor options are so you can pick the option that’s right for you.

Let’s start with CRT, the old tried-and-true Cathode Ray Tube monitor. These monitors contain millions of tiny red, green, and blue phosphor dots. An electron beam is directed at these phosphor dots, causing them to glow and create a visible image.

Maybe you’ve already heard of cathodes and anodes; they’re the couple responsible for batteries’ ability to store energy and the controlled flow of electricity in general. In a battery, the anode is the positive terminal and the cathode is the negative terminal.

In a cathode ray tube, the cathode refers to the heated filament contained in a vacuum created by the glass tubes. The ray is the electron beam that hits the phosphor dots. When the cathode is heated, the electrons pour off of it and are attracted to the positively charged anode positioned in front of the cathode. The screen is coated with phosphor, which glows when it comes in contact with the electron beam.

There are several techniques for turning this reaction into a clear image. One involves a shadow mask, which is a very thin metal screen filled with tiny holes. Three electron beams pass through the holes and are controlled to the extent that they can strike the correct phosphor dot at the right intensity. Another technique involves the aperture grill, which is composed of tiny vertical wires and also blocks the electron beam to make it more accurate. The slot-mask tube combines these two technologies.

Now let’s move on to LCD monitors, or liquid crystal display monitors. LCD monitors contain two substrates (pieces of polarized glass) that enclose a layer of liquid crystal material. A backlight then shines light rays through the first substrate as electrical currents cause the liquid crystal molecules to align based on the strength of the current. Their alignment then affects what levels of light are able to pass through to the second substrate and create the colors and images displayed on the monitor.

LCD monitors can use either an active or a passive matrix display. In an active matrix display (the more common kind), thin-film transistors and capacitors are arranged in a matrix on the glass of the display. A particular pixel is reached by activating the relevant row and sending a charge down the appropriate column. In a passive matrix display, a grid of conductive metal is used to charge each pixel. Passive displays are less expensive to produce, but their design tends to be slow and imprecise, so they’re much less commonly used.
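The row-and-column addressing described above can be sketched as a toy simulation. The grid size and charge values are made up for illustration; a real panel drives millions of cells with analog voltages, but the addressing idea is the same: select a row, drive a column, and only the cell at their intersection stores the charge.

```python
# Toy simulation of active-matrix addressing: a pixel is written by
# activating its row line and driving a charge down its column line.

class Matrix:
    def __init__(self, rows: int, cols: int):
        # Each cell models a capacitor holding a charge level (0-255).
        self.cells = [[0] * cols for _ in range(rows)]

    def write_pixel(self, row: int, col: int, charge: int) -> None:
        # In a real panel, the row's transistors switch on, the column
        # driver applies a voltage, and only the (row, col) cell stores it.
        self.cells[row][col] = charge

panel = Matrix(rows=4, cols=4)
panel.write_pixel(2, 3, 200)
print(panel.cells[2][3])  # → 200
print(panel.cells[0][0])  # → 0
```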

The one drawback of LCD monitors is that they tend to lose color accuracy when not viewed from the right angle. Otherwise, they’re a great alternative to the outdated CRT screens.

On Wednesday, December 9th, Microsoft announced that it would be using Germany-based data centers hosted by Deutsche Telekom to offer its Azure, Office 365 and Dynamics CRM cloud services to business clients.

According to Microsoft, this was the best possible way to protect the security of its customers’ data.

The data centers themselves are run by T-Systems, a subsidiary of Deutsche Telekom, which acts as a data trustee and manages all data access.

Without the OK from T-Systems, not even Microsoft can access client data. And even when that permission is granted, Microsoft can only access said data under the supervision of T-Systems. Microsoft rep Jennifer Reynold explained further:

“The data trustee controls access approval to customer data by anyone other than the customer and their end users. This means that operations or other tasks that require access to customer data or the infrastructure in which customer data resides will be performed or supervised by the data trustee, or in certain instances directly by the customer. Microsoft and its subcontractors will not have access to customer data without prior approval by the customer or the data trustee.”

The announcement was made one day after Microsoft first presented its plan to expand its cloud computing services. The program is scheduled to commence sometime after June of 2016. Microsoft can’t move forward with the expansion until it is in compliance with a strict set of security standards including multifactor authentication with biometric scanning and smart cards, data encryption by SSL/TLS protocols based on German certificates, physical security controls, and protection against power outages and natural disasters.

Why outsource a cloud computing service to Germany when the United States is undergoing a tech renaissance? The answer is simple: ever since the European Court of Justice ruled that the 15-year-old Safe Harbor treaty with the United States was invalid due to its inability to protect the civil rights of European citizens, any U.S.-based company attempting to provide data security services to European clients has been scrambling to figure out how to prove it can truly protect the privacy of European data.

Public Knowledge vice president of legal affairs Sherwin Siy explained, “This looks like Microsoft making sure it can continue to do cloud business in the EU regardless of any uncertainty created by the recent Schrems Safe Harbor decision. By making sure that the data doesn’t have to leave the EU, Microsoft reduces the possible liability for sending Europeans’ personal data to the U.S. or elsewhere.”

Conflicts such as these illustrate the confusion of our fast-moving times; the worlds of physical borders and virtual space collide in certain legal contexts, especially when it comes to government surveillance. Countries like the United States and the United Kingdom prioritize national security and anti-terror efforts over the privacy of their citizens, while countries like Germany see legalized government snooping as a slippery slope that could lead to the loss of civilians’ civil rights.

European clientele using American data storage services are likely uncomfortable with the idea of being snooped on by the U.S. government, so Microsoft may have just made a very tactful decision.

As of late, it seems that some of the general beneficiaries of tech (those in charge of sales, whether private or business-to-business) may be actively working toward their own demise.

Now that consumers have access to the internet, and thus a plethora of price quotes, reviews, and general information about competing products, aspects of sales (e.g., pitching the product, reeling in the customer) are becoming obsolete.

In other words, it has become commonplace for customers to already have made up their minds about products by the time they’re reaching out to salespeople, making sales a little too easy.

This trend hasn’t gone unnoticed by the head honchos running tech companies.

CRM research and advisory firm Forrester predicted that 1 million business-to-business salespeople in America were likely to lose their jobs to self-service e-commerce by 2020.

Considering there are only about 5 million B2B salespeople currently operating in the U.S., that’s a pretty big deal.

So who’s going to lose their jobs and who’s going to keep them?

Generally, it depends on the simplicity of the job performed. If a salesperson’s job is simply to take orders, their livelihood is likely to be replaced by automated options in the 2020s.

Alternatively, aspects of CRM (Customer Relationship Management) will likely remain an important sales field: subtle enough to be helped by machines, but not fully automatable.

For example, some experts believe that the Internet of Things (IoT) era will boost salespeople’s potential to retain customers. This is because a higher quantity of objects and devices are expected to be able to communicate data about themselves and their usage, allowing for businesses to be more responsive to their customers’ needs.

On the other hand, IoT will also allow for devices to purchase things when necessary (for example, a printer may be programmed to order toner when it is running low), meaning that people can opt into long-term commitments to certain service providers simply out of convenience. This drastically reduces the necessity of salespeople to receive and process re-orders.
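The toner example above can be sketched in a few lines. This is a hypothetical illustration: the function name, threshold, and return values are invented, and a real deployment would call a supplier's ordering API instead of returning a string.

```python
# Hypothetical sketch of an IoT auto-reorder rule: a connected printer
# reports its supply level, and an order is placed automatically once it
# drops below a threshold. Names and thresholds are illustrative only.

def check_and_reorder(toner_percent: float, threshold: float = 15.0) -> str:
    """Return the action an IoT-enabled printer would take."""
    if toner_percent < threshold:
        # In a real deployment this would call the supplier's ordering API.
        return "order placed"
    return "no action"

print(check_and_reorder(8.0))   # → order placed
print(check_and_reorder(60.0))  # → no action
```

Once rules like this run unattended, the re-order that a salesperson used to process simply never reaches a human, which is exactly the displacement the article describes.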

It is likely that the salespeople who keep their jobs will have taken it upon themselves to become product experts. The accessibility of information on the internet raises the bar for how much a salesperson needs to know about a product to be useful; nowadays, the customer has already learned the preliminary information about the product he or she wants and is more concerned with the detailed differences between one product and another.

Given the accessibility of that information, customers expect salespeople to know all the information about a product offhand; after all, the customers could just look up the answers to their questions themselves, so the salesperson becomes a paid convenience.

Luckily for salespeople, CRM can also help them be prepared experts for customers. So long as they log a history of questions along with satisfying answers, an adequate training program can be created for salespeople who want to remain relevant, and employed.

In today’s technological world, everything is computerized, in the sense that any major business operation can be put into digital form with the help of computers, networks, hardware, and software, making it more professional and efficient as well as easier to manage and control. Going digital is the way forward, and it helps a business connect with the systems and operations of related businesses easily. Computers loaded with the necessary operating system and software, together with hardware accessories and network technology, make any operation more efficient and help it run smoothly. One can manage even the hardest of business operations easily and efficiently by using computer networks and internet technologies.

Database management:

Any operation needs a database, also called a schema, to store the necessary information for daily as well as future use. The term information technology stands for the combination of computers, networks, internet technologies, telecommunications, electronic media, and storage; these are the main and crucial elements of a computerized business operation. There are many types of database software available, a premium one being Oracle from Oracle Corp. of California, USA. Oracle is an RDBMS, a Relational Database Management System, in which digital data is stored in the form of relations: tables that in turn have rows and columns.
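The rows-and-columns idea applies to any relational database, not just Oracle. Here is a minimal sketch using Python's built-in sqlite3 module; the table and column names are invented for illustration.

```python
import sqlite3

# Create an in-memory relational database: a "relation" is a table
# whose rows are records and whose columns are fields.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE employees (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        dept TEXT NOT NULL
    )
""")
conn.execute("INSERT INTO employees (name, dept) VALUES ('Asha', 'Sales')")
conn.execute("INSERT INTO employees (name, dept) VALUES ('Luis', 'IT')")
conn.commit()

rows = conn.execute("SELECT name, dept FROM employees ORDER BY id").fetchall()
print(rows)  # → [('Asha', 'Sales'), ('Luis', 'IT')]
```

The same CREATE/INSERT/SELECT pattern carries over to Oracle or any other RDBMS, with dialect differences in types and features.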

Data retrieval and manipulation:

Any digital data that is stored must be easy to retrieve and manipulate. Data retrieval means searching for particular information within a huge database and pulling it up on the screen. For this, the RDBMS query language is used to query, add, delete, or modify the data present in the schema. Data must not only be easy to retrieve and manipulate, but must be backed up on a regular basis as well. The DBA plays a crucial role in performance tuning and in administering the data, taking proper copies of it in case of emergency situations like data damage, data loss, or a system crash.
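Retrieval, manipulation, and backup can all be sketched with the same built-in sqlite3 module standing in for a full RDBMS; the schema below is invented for illustration.

```python
import sqlite3

# Sketch of retrieval, manipulation, and backup using SQL statements.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
src.execute("INSERT INTO accounts (balance) VALUES (100.0)")

# Retrieval: query the data.
(balance,) = src.execute("SELECT balance FROM accounts WHERE id = 1").fetchone()

# Manipulation: modify the stored value.
src.execute("UPDATE accounts SET balance = balance + 50 WHERE id = 1")
src.commit()

# Backup: copy the live database to another connection, much as a DBA
# keeps copies to guard against data damage, loss, or a system crash.
dest = sqlite3.connect(":memory:")
src.backup(dest)
(copied,) = dest.execute("SELECT balance FROM accounts WHERE id = 1").fetchone()
print(balance, copied)  # → 100.0 150.0
```

In production, backups would go to separate storage on a schedule rather than to another in-memory connection, but the principle is the same.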

Vast field:

Information technology is a vast subject, just like an ocean: deep, wide, and full. The field of IT, as it is often referred to, has evolved enormously over the last couple of decades. With so many different types of companies and products, from hardware to software to accessories, the field has grown so much that it is hard to sum it all up on a single piece of paper. Any business, be it telecommunications, banking, or transportation such as railways or airways, can be easily digitized or computerized using networks and technology. The field is so vast that one can even operate a system remotely by logging in to a network of computers from another location. Network safety and security play a vital role here, in the sense that firewalls must be set up to safeguard both the network and its data from any kind of hacking.

There’s a new technology coming to the market that can finally take the frustration out of hassling with USB ports.

It’s USB Type-C.

The common joke has always existed that you can’t connect a USB cable correctly the first time: you will flip the plug multiple times trying to insert it, and usually it doesn’t work until you inspect it (a joke once made by Intel). USB Type-C is meant to be a universal and reversible connector, and may even double throughput. For manufacturers, it can be implemented alongside previously used USB standards, meaning the expensive 3.0 hardware will not need to be included.

While the switch to USB 3.0 was easy for consumers thanks to color coordination, USB Type-C offers no such cue. Interestingly enough, though, it’s small enough to fit in tablets and even phones. This would be huge for manufacturers, as it may finally give consumers the ability to use the same data cables across the board for all of their devices. Not only that, but Type-C can potentially provide up to 100W of power, an enormous jump from the simple 10W of before.

Unfortunately, there’s still heavy competition from Thunderbolt and its high speeds. Type-C may offer higher bandwidth, but the fact that the connector is flippable will most likely be the largest selling point for consumers. It was more recently revealed that Type-C can support simultaneous transport for DisplayPort 1.3, meaning that HDMI 2.0 and DVI can be used with an adapter (it has been confirmed, though, that Type-C will not be able to support DisplayPort Dual Mode). Regardless, this is finally the solution that everyone has been waiting for, and the final kicker to bring USB out on top of Thunderbolt, which has been off to a slow start anyway.

Thunderbolt hardware is expensive, nearly $100 for the controllers, while USB controllers can cost a manufacturer just a couple of dollars. New Thunderbolt-enabled products are priced hundreds of dollars above others built with USB or even eSATA. It’s the same issue FireWire had. You only see Thunderbolt in devices where the cost of including it can be absorbed by an already high list price (like cameras and high-end audio and visual equipment). Even Hewlett-Packard, currently the largest manufacturer of PCs, opted for USB over Thunderbolt.

Thunderbolt (which is essentially just a copper-wire version of Light Peak) may be increasing its wattage in order to catch up, but Alternate Modes for Type-C are something it hasn’t been able to touch. This connector will most likely be used in newer phone models immediately, as it is much easier to include in a design than an MHL or DisplayPort socket. With this new connector, you can attach a digital camera to your computer and use the same cord to attach it to a power supply, or your tablet to your desktop, or your phone to a monitor. The possibilities are endless.

With this technology, it’s finally possible that a single cable can be used for everything. With technological advances like this, who’s to say what they’ll do next?

Computers are electronic devices that have replaced manpower in more ways than one; they are used extensively for storing, retrieving, and processing voluminous data to produce information that is readable to the user. These devices work in conjunction with hardware and software components, which are interdependent at all times.

Hardware Analysis:

While there are innumerable ways to address a hardware problem, the best approach is a pragmatic one: reach out to hard drive data recovery services rather than getting carried away with various online quick fixes.

Backups and Updates – critical activities

Information is considered wealth, which makes regular data backup the most important of periodic activities.

Updating the system with the latest system updates including important firmware and driver fixes matching the hardware will help a great deal. In addition, a new kernel that will better support the hardware can be incorporated into the system.

Understanding the fundamentals of computer hardware can aid in identifying and rectifying a problem.

Confirming a failed hard drive

Most hard drives come with free diagnostic tools that will help you run a test to determine whether or not the problem is due to a failed drive. Scanning your computer for malware by running its built-in virus protection software will help you detect any adware or spyware infections.
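A diagnostic tool's output typically boils down to a single pass/fail health line. Here is a hypothetical sketch of reading such a result; the sample report text below is illustrative and not the exact output of any particular vendor's tool.

```python
# Hypothetical sketch: parse the overall health line from a SMART-style
# diagnostic report. The sample text is illustrative only.

SAMPLE_REPORT = """\
Device Model: ExampleDisk 500GB
SMART overall-health self-assessment test result: PASSED
"""

def drive_passed(report: str) -> bool:
    """Return True if the report's overall-health line ends with PASSED."""
    for line in report.splitlines():
        if "overall-health" in line:
            return line.rstrip().endswith("PASSED")
    return False

print(drive_passed(SAMPLE_REPORT))  # → True
```

A failing or missing health line would be the cue to stop writing to the drive and move on to recovery options.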

Optimizing your computer by running a disk defragmenter is yet another option for troubleshooting a failing hard drive. The simplest thing you can always do is check that the hard drive is properly connected to the correct components of your system.

Data recovery software is the one stop practical solution for recovering lost data from your failed hard drive.

The Latest Data Recovery Software Provides Additional Features

Despite the popularity of storage devices, people are always faced with problems concerning loss of data. Whatever the cause, data recovery software comes to the rescue during unforeseen exigencies of information loss. While some of this software is available free online, plenty of paid options come with additional features. It is interesting to note that data recovery software is available for Apple Macs, personal computers, and even mobiles, lending a hand in disaster recovery.

While there is no promise of 100% success when it comes to data recovery software, even the best recovery tools fail when a file has been partially overwritten, and the chances of recovery are pretty low when a long time has passed since the file was deleted. However, the chances are bright for recovering a file that has been deleted recently, all thanks to data recovery software. So, what are you waiting for? If you have witnessed a hardware error on your computer or laptop, it is time to look into hard drive data recovery services to get your problem solved.