HDP Solution is a professional IT solutions provider specializing in mobile application development. We constantly innovate with the goal of delivering comprehensive applications and building a diverse application ecosystem that meets all of our customers' needs.
Business areas of HDP Solution:
– iOS / Android / Windows Phone Application Development.
– Website, Server Development.
– Game Development.
– Embedded Software Development.
6 USES OF INFORMATION TECHNOLOGY IN EDUCATION
Education and learning are lifetime processes; there is no limit on when to start or stop. In our daily lives we learn new things, and this helps us change the way we live. Education provides us with information, and we then have to learn and process that information for our own use. It is important to make education accessible to everyone at any time, as this helps reduce illiteracy. Information technology can speed up information delivery, and this ability can be used to improve our learning environment. With the implementation of information technology, the cost of accessing educational material is cut, and students can learn from anywhere.
New technologies are changing the way we learn, and they have also changed the process of teaching. Both teachers and students use these new educational technologies to achieve specific academic goals. The only challenge is that information technology comes at a cost, so those who cannot afford it have difficulty benefiting from its opportunities in education. For example, the spread of broadband internet makes it easy for students to access academic information on time, and teachers use it to create and deliver academic material with videos and graphic illustrations.
Below are some detailed points on the use of information technology in education and schools.
- Plenty of Educational Resources: Information technology makes it easy to access academic information at any time. Both students and teachers use it to acquire and exchange educational material. For example, teachers can easily provide visual and audio classes to their students using computers and broadband internet. This breaks down the boundaries of accessing information, because a student can attend a lecture without being in a physical classroom. Teachers can also assign tasks to students via email or campus educational forums.
- Instant Access to Educational Information: Information technology speeds up the transfer and distribution of information. Students can easily access academic data using computers and newer technologies like mobile applications. Developers have built educational apps that let students retrieve information very quickly. These apps are replacing older methods like borrowing books from libraries: nowadays students can use a library's mobile app to download books as e-books, so they have them on hand at all times, which saves time and lets them read anywhere.
- Full-Time Learning: In the past, learning was limited to a physical classroom, and students and teachers could only access academic information while at school. Today all that has changed: a student can access information at any time of day, no matter where they are. Information technology has made online education possible, so a student in Africa can study the same course as a student in the USA or India. And when it comes to getting jobs, all of these students can compete for the same positions.
- Group Learning: Information technology has helped students learn in groups, and it has also helped teachers teach in groups. In the past, group discussions at school required each member to contribute, but shy students could stay away out of fear of expressing themselves. Now, with information technology, schools have created academic forums where students can discuss a specific topic without that fear; they can also engage in video and text chat. Another benefit of these online group discussions is that members need not all come from the same class or school, as they did in the past. Students from various schools around the world can join the same academic group and share academic information. A student in Africa who joins a discussion group of Harvard University students, for example, will benefit from the information exchanged there.
- Use of Audio-Visual Material: Information technology has changed the way we learn and interpret information. Audio-visual education helps students learn faster and more easily than text and blackboard notes, which tend to bore them. It is a human weakness: people do not want to read long stretches of text, so the introduction of audio-visual technology makes students enjoy what they are learning. Visual illustration with images on a projector helps a student understand a concept, because many of these images are interesting and look familiar. Our brains remember visual illustrations more easily than text, which is why you can easily remember someone's face but fail to remember their name.
- Long-Distance Learning: Information technology enables students across the globe to study from anywhere through online education. This has been made possible by the widespread availability of cheap broadband internet in both developed and developing countries. In the past, some courses were offered only in developed countries, so a student had to go through the hassle of leaving their home country, which was too expensive. Nowadays a student can access these courses online. Many universities have opened their curricula to the world, so for a small fee a student can enroll; these students take the same exams and are graded by the same teachers.
Everyday Examples of Artificial Intelligence and Machine Learning
With all the excitement and hype about AI that’s “just around the corner”—self-driving cars, instant machine translation, etc.—it can be difficult to see how AI is affecting the lives of regular people from moment to moment. What are examples of artificial intelligence that you’re already using—right now?
In the process of navigating to these words on your screen, you almost certainly used AI. You’ve also likely used AI on your way to work, communicating online with friends, searching on the web, and making online purchases.
We distinguish between AI and machine learning (ML) throughout this article when appropriate. At TechEmergence, we've developed concrete definitions of both artificial intelligence and machine learning based on feedback from a panel of experts. To simplify the discussion, think of AI as the broader goal of autonomous machine intelligence, and machine learning as the specific scientific methods currently in vogue for building AI. All machine learning is AI, but not all AI is machine learning.
Our enumerated examples of AI are divided into Work & School and Home applications, though there's plenty of room for overlap. Each example is accompanied by a "glimpse into the future" that illustrates how AI will continue to transform our daily lives in the near future.
Examples of Artificial Intelligence: Work & School
According to a 2015 report by the Texas Transportation Institute at Texas A&M University, commute times in the US have been steadily climbing year-over-year, resulting in 42 hours of rush-hour traffic delay per commuter in 2014—more than a full work week per year, with an estimated $160 billion in lost productivity. Clearly, there’s massive opportunity here for AI to create a tangible, visible impact in every person’s life.
Reducing commute times is no simple problem to solve. A single trip may involve multiple modes of transportation (e.g. driving to a train station, riding the train to the optimal stop, and then walking or using a ride-share service from that stop to the final destination), not to mention the expected and the unexpected: construction, accidents, road or track maintenance, and weather conditions can constrict traffic flow with little to no notice. Furthermore, long-term trends may diverge from historical data, depending on changes in population, demographics, local economics, and zoning policies. Here's how AI is already helping to tackle the complexities of transportation.
1 – Google’s AI-Powered Predictions
Using anonymized location data from smartphones, Google Maps (Maps) can analyze the speed of movement of traffic at any given time. And, with its acquisition of crowdsourced traffic app Waze in 2013, Maps can more easily incorporate user-reported traffic incidents like construction and accidents. Access to vast amounts of data being fed to its proprietary algorithms means Maps can reduce commutes by suggesting the fastest routes to and from work.
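The core idea is simple to sketch: if you know how fast phones are moving along a road segment, you can estimate its current travel time. The toy Python function below is an illustration only; Google's actual models, data pipeline, and routing algorithms are proprietary and far more involved, and all numbers here are invented.

```python
# Estimate travel time for one road segment from crowd-reported speeds.
# In a real system these speeds would come from anonymized phone pings.

def segment_eta(segment_km, reported_speeds_kmh):
    """Average the reported speeds and convert distance to minutes."""
    avg_speed = sum(reported_speeds_kmh) / len(reported_speeds_kmh)
    return segment_km / avg_speed * 60  # minutes

# Three phones report 60, 50, and 70 km/h on a 5 km segment:
print(round(segment_eta(5, [60, 50, 70]), 1))  # 5.0 minutes at 60 km/h average
```

A router could then sum these per-segment estimates over candidate routes and suggest the fastest one.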
2 – Ridesharing Apps Like Uber and Lyft
How do they determine the price of your ride? How do they minimize the wait time once you hail a car? How do these services optimally match you with other passengers to minimize detours? The answer to all these questions is ML.
Jeff Schneider, engineering lead for Uber ATC, discussed in an NPR interview how the company uses ML to predict rider demand so that "surge pricing" (short periods of sharp price increases to decrease rider demand and increase driver supply) will soon no longer be necessary. Uber's Head of Machine Learning Danny Lange confirmed that Uber uses machine learning for ETAs on rides, estimated meal delivery times on UberEATS, computing optimal pickup locations, and fraud detection.
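To make the surge mechanism concrete, here is a deliberately simplified sketch of demand-based pricing. The formula, cap, and numbers are invented for illustration; Uber's real system relies on trained ML demand forecasts rather than a fixed supply/demand ratio.

```python
# Toy surge pricing: raise prices when rider demand outstrips driver supply.

def surge_multiplier(riders_waiting, drivers_available, cap=3.0):
    if drivers_available == 0:
        return cap                       # no supply at all: maximum surge
    ratio = riders_waiting / drivers_available
    return min(cap, max(1.0, ratio))     # never below the base fare

print(surge_multiplier(10, 20))  # 1.0: plenty of drivers, no surge
print(surge_multiplier(40, 20))  # 2.0: demand is double the supply
```

Predicting demand ahead of time, as Schneider describes, would let the service pre-position drivers so this ratio rarely exceeds 1.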
3 — Commercial Flights Use an AI Autopilot
AI autopilot in commercial airliners is a surprisingly early use of the technology, dating as far back as 1914, depending on how loosely you define autopilot. The New York Times reports that the average Boeing flight involves only seven minutes of human-steered flight, typically reserved for takeoff and landing.
Glimpse into the future
In the future, AI will shorten your commute even further via self-driving cars that result in up to 90% fewer accidents, more efficient ride sharing to reduce the number of cars on the road by up to 75%, and smart traffic lights that reduce wait times by 40% and overall travel time by 26% in a pilot study.
The timeline for some of these changes is unclear, as predictions vary about when self-driving cars will become a reality: BI Intelligence predicts fully autonomous vehicles will debut in 2019; Uber CEO Travis Kalanick says the timeline for self-driving cars is "a years thing, not a decades thing"; and Andrew Ng, Chief Scientist at Baidu and a Stanford faculty member, predicted in early 2016 that self-driving cars will be mass-produced by 2021. On the other hand, The Wall Street Journal interviewed several experts who say fully autonomous vehicles are decades away. TechEmergence also discussed the timeline for self-driving cars with Eran Shir, CEO of AI-powered dashcam app Nexar, who believes virtual chauffeurs are closer than we think.
1 – Spam Filters
Your email inbox seems like an unlikely place for AI, but the technology largely powers one of its most important features: the spam filter. Simple rules-based filters (e.g. "filter out messages containing the words 'online pharmacy' or 'Nigerian prince' that come from unknown addresses") aren't effective against spam, because spammers can quickly update their messages to work around them. Instead, spam filters must continuously learn from a variety of signals, such as the words in the message and its metadata (where it was sent from, who sent it, etc.).
They must further personalize their results based on your own definition of what constitutes spam; perhaps the daily-deals email you consider spam is a welcome sight in the inboxes of others. Through the use of machine learning algorithms, Gmail successfully filters 99.9% of spam.
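As a rough illustration of "learning from the words in a message", here is a toy Naive Bayes spam scorer. The training messages, vocabulary, and model simplicity are invented for the example; Gmail's production filter learns from vastly more signals than word counts.

```python
# Toy Naive Bayes spam scorer: learn word statistics from labeled
# messages, then score new messages by their log-odds of being spam.
import math
from collections import Counter

def train(messages):
    """messages: list of (text, is_spam) pairs."""
    counts = {True: Counter(), False: Counter()}
    totals = {True: 0, False: 0}
    for text, is_spam in messages:
        for word in text.lower().split():
            counts[is_spam][word] += 1
            totals[is_spam] += 1
    return counts, totals

def spam_probability(text, counts, totals):
    # Log-odds with Laplace smoothing; assumes equal class priors.
    vocab = set(counts[True]) | set(counts[False])
    score = 0.0
    for word in text.lower().split():
        p_spam = (counts[True][word] + 1) / (totals[True] + len(vocab))
        p_ham = (counts[False][word] + 1) / (totals[False] + len(vocab))
        score += math.log(p_spam / p_ham)
    return 1 / (1 + math.exp(-score))  # convert log-odds to a probability

training = [
    ("cheap pills from online pharmacy", True),
    ("claim your prize now", True),
    ("meeting notes attached", False),
    ("lunch tomorrow with the team", False),
]
counts, totals = train(training)
print(spam_probability("online pharmacy prize", counts, totals) > 0.5)  # True (spam-like)
print(spam_probability("team meeting tomorrow", counts, totals) < 0.5)  # True (ham-like)
```

Each new message a user marks as spam or not-spam becomes another training pair, which is how such a filter keeps up with spammers.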
2 – Smart Email Categorization
Gmail uses a similar approach to categorize your emails into primary, social, and promotion inboxes, as well as labeling emails as important. In a research paper titled, “The Learning Behind Gmail Priority Inbox”, Google outlines its machine learning approach and notes “a huge variation between user preferences for volume of important mail…Thus, we need some manual intervention from users to tune their threshold. When a user marks messages in a consistent direction, we perform a real-time increment to their threshold.” Every time you mark an email as important, Gmail learns. The researchers tested the effectiveness of Priority Inbox on Google employees and found that those with Priority Inbox “spent 6% less time reading email overall, and 13% less time reading unimportant email.”
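The real-time threshold tuning Google describes can be sketched in a few lines. The importance scores, step size, and class below are invented; they only illustrate the idea of nudging a per-user threshold toward the user's judgment.

```python
# Toy per-user importance threshold, adjusted by user feedback
# in the spirit of the Priority Inbox paper quoted above.

class PriorityInbox:
    def __init__(self, threshold=0.5, step=0.05):
        self.threshold = threshold   # score above this => "important"
        self.step = step

    def is_important(self, score):
        return score >= self.threshold

    def user_feedback(self, score, marked_important):
        # The user disagreed with the model: move the threshold
        # toward the user's judgment in real time.
        if marked_important and not self.is_important(score):
            self.threshold -= self.step   # be more generous
        elif not marked_important and self.is_important(score):
            self.threshold += self.step   # be more strict

inbox = PriorityInbox()
inbox.user_feedback(score=0.45, marked_important=True)  # model missed one
print(inbox.threshold)  # threshold lowered from 0.5 to 0.45
```

Consistent marking in one direction keeps shifting the threshold, which matches the paper's "real-time increment" description.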
Glimpse into the future
Can your inbox reply to emails for you? Google thinks so, which is why in 2015 it introduced Smart Reply in Inbox, its next-generation email interface. Smart Reply uses machine learning to automatically suggest three brief (but customized) responses to an email. As of early 2016, 10% of mobile Inbox users' emails were sent via Smart Reply. In the near future, Smart Reply will be able to provide increasingly complex responses. Google has already demonstrated its intentions in this area with Allo, a new instant messaging app that can use Smart Reply to provide both text and emoji responses.
Grading and Assessment
1 – Plagiarism Checkers
Many high school and college students are familiar with services like Turnitin, a popular tool used by instructors to analyze students’ writing for plagiarism. While Turnitin doesn’t reveal precisely how it detects plagiarism, research demonstrates how ML can be used to develop a plagiarism detector.
Historically, plagiarism detection for regular text (essays, books, etc.) has relied on having a massive database of reference materials to compare against the student's text. However, ML can help detect plagiarism of sources that are not in the database, such as sources in foreign languages or older sources that have never been digitized. For instance, two researchers used ML to predict, with 87% accuracy, when source code had been plagiarized. They looked at a variety of stylistic factors that could be unique to each programmer, such as the average length of a line of code, how much each line was indented, how frequent code comments were, and so on.
The algorithmic key to plagiarism is the similarity function, which outputs a numeric estimate of how similar two documents are. An optimal similarity function not only is accurate in determining whether two documents are similar, but also efficient in doing so. A brute force search comparing every string of text to every other string of text in a document database will have a high accuracy, but be far too computationally expensive to use in practice. One MIT paper highlights the possibility of using machine learning to optimize this algorithm. The optimal approach will most likely involve a combination of man and machine. Instead of reviewing every single paper for plagiarism or blindly trusting an AI-powered plagiarism detector, an instructor can manually review any papers flagged by the algorithm while ignoring the rest.
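A minimal similarity function can be sketched with Jaccard similarity over word n-grams ("shingles"). Production detectors add hashing techniques such as MinHash to avoid the brute-force cost described above; this sketch shows only the core comparison, with invented example sentences.

```python
# Jaccard similarity over word trigrams: a simple similarity function
# that outputs a numeric estimate of how alike two documents are.

def shingles(text, n=3):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b, n=3):
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)  # shared shingles / all shingles

original = "the quick brown fox jumps over the lazy dog"
copied = "the quick brown fox jumps over a sleeping cat"
unrelated = "machine learning optimizes the similarity search"

print(jaccard(original, copied))     # 0.4: four of ten trigrams shared
print(jaccard(original, unrelated))  # 0.0: no trigrams shared
```

An instructor-facing tool would flag pairs whose score exceeds some threshold for manual review, as the paragraph above suggests.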
Essay grading is very labor-intensive, which has encouraged researchers and companies to build essay-grading AIs. While their adoption varies among classes and educational institutions, it's likely that you (or a student you know) has interacted with these "robo-readers" in some way. The Graduate Record Exam (GRE), the primary test used for graduate school admissions, grades essays using one human reader and one robo-reader called e-Rater. If the scores differ substantially, a second human reader is brought in to settle the discrepancy. This addresses the primary concern with robo-readers: if students can deduce the heuristics e-Rater uses to determine their grade, they could easily exploit them to write nonsensical essays that would still score highly. This hybrid approach contrasts with how ETS handles the SAT, where two human graders evaluate essays and a third is brought in only if the two scores differ substantially. The GRE's synergistic approach shows that by pairing human intelligence with artificial intelligence, the overall grading system costs less and accomplishes more.
Glimpse into the future
There are many promising avenues for AI to improve education in the future. One-size-fits-all classes may be replaced by personalized, adaptive learning that is tailored to each student’s individual strength and weaknesses. ML may also be used to identify at-risk students early on so that schools can focus extra resources on those students and decrease dropout rates.
One of TechEmergence’s most popular guides is on machine learning in finance. While the guide discusses machine learning in an industry context, your regular, everyday financial transactions are also heavily reliant on machine learning.
1 – Mobile Check Deposits
Most large banks offer the ability to deposit checks through a smartphone app, eliminating a need for customers to physically deliver a check to the bank. According to a 2014 SEC filing, the vast majority of major banks rely on technology developed by Mitek, which uses AI and ML to decipher and convert handwriting on checks into text via OCR.
2 – Fraud Prevention
How can a financial institution determine if a transaction is fraudulent? In most cases, the daily transaction volume is far too high for humans to manually review each transaction. Instead, AI is used to create systems that learn what types of transactions are fraudulent. FICO, the company that creates the well-known credit ratings used to determine creditworthiness, uses neural networks to predict fraudulent transactions. Factors that may affect the neural network’s final output include recent frequency of transactions, transaction size, and the kind of retailer involved.
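As a toy illustration of how the factors named above (recent transaction frequency, transaction size, retailer category) might combine into a fraud score, here is a simple logistic model. The weights and bias are invented stand-ins; FICO's actual neural networks are trained on real transaction data and are proprietary.

```python
# Toy logistic fraud score: weighted features squashed into (0, 1).
import math

# Invented weights: a positive weight pushes the score toward "fraud".
WEIGHTS = {
    "txns_last_hour": 0.8,     # a burst of transactions is suspicious
    "amount_vs_typical": 1.2,  # multiple of the customer's usual spend
    "risky_retailer": 1.5,     # 1.0 if the retailer category is high-risk
}
BIAS = -4.0

def fraud_score(features):
    z = BIAS + sum(WEIGHTS[k] * features[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))  # probability-like score in (0, 1)

normal = {"txns_last_hour": 1, "amount_vs_typical": 1.0, "risky_retailer": 0.0}
suspect = {"txns_last_hour": 5, "amount_vs_typical": 4.0, "risky_retailer": 1.0}

print(fraud_score(normal) < 0.5 < fraud_score(suspect))  # True
```

A real system learns these weights (and far subtler feature interactions) from millions of labeled transactions instead of hand-picking them.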
3 – Credit Decisions
Whenever you apply for a loan or credit card, the financial institution must quickly determine whether to accept your application and if so, what specific terms (interest rate, credit line amount, etc.) to offer. FICO uses ML both in developing your FICO score, which most banks use to make credit decisions, and in determining the specific risk assessment for individual customers. MIT researchers found that machine learning could be used to reduce a bank’s losses on delinquent customers by up to 25%.
Glimpse into the future
Can a robot give you sound investing advice? That’s the premise behind upstarts like Wealthfront and Betterment, which attempt to automate the best practices of seasoned investors and offer them to customers at a much lower cost than traditional fund managers. In early 2016, Wealthfront announced it was taking an AI-first approach, promising “an advice engine rooted in artificial intelligence and modern APIs, an engine that we believe will deliver more relevant and personalized advice than ever before.” While there is no data on the long-term performance of robo-advisors (Betterment was founded in 2008, Wealthfront in 2011), they will become the norm for regular people looking to invest their savings. This is already happening with younger people—in the above announcement, Wealthfront notes that 60% of its customers are under the age of 35.
Examples of Artificial Intelligence: Home
1 – Facebook
When you upload photos to Facebook, the service automatically highlights faces and suggests friends to tag. How can it instantly identify which of your friends is in the photo? Facebook uses AI to recognize faces. In a short video highlighting their AI research (below), Facebook discusses the use of artificial neural networks—ML algorithms that mimic the structure of the human brain—to power facial recognition software. The company has invested heavily in this area not only within Facebook, but also through the acquisitions of facial-recognition startups like Face.com, which Facebook acquired in 2012 for a rumored $60M, Masquerade (2016, undisclosed sum), and Faciometrics (2016, undisclosed sum).
Facebook also uses AI to personalize your newsfeed and ensure you’re seeing posts that interest you, as discussed in a TechEmergence interview with Facebook’s Hussein Mehanna. And, of particular business interest to Facebook is showing ads that are relevant to your interests. Better targeted ads mean you’re more likely to click them and buy something from the advertisers—and when you do, Facebook gets paid. In the first quarter of 2016, Facebook and Google secured a total of 85% of the online ad market—precisely because of deeply-targeted advertisements.
In June 2016, Facebook announced a new AI initiative: DeepText, a text understanding engine that, the company claims, "can understand with near-human accuracy the textual content of several thousand posts per second, spanning more than 20 languages." DeepText is used in Facebook Messenger to detect intent—for instance, by allowing you to hail an Uber from within the app when you message "I need a ride" but not when you say, "I like to ride donkeys." DeepText is also used to automate the removal of spam, help popular public figures sort through the millions of comments on their posts to see the most relevant ones, automatically identify for-sale posts and extract relevant information, and identify and surface content in which you might be interested.
2 – Pinterest
Pinterest uses computer vision, an application of AI where computers are taught to “see”, in order to automatically identify objects in images (or “pins”) and then recommend visually similar pins. Other applications of machine learning at Pinterest include spam prevention, search and discovery, ad performance and monetization, and email marketing.
3 – Instagram
Instagram, which Facebook acquired in 2012, uses machine learning to identify the contextual meaning of emoji, which have been steadily replacing slang (for instance, a laughing emoji could replace “lol”). By algorithmically identifying the sentiments behind emojis, Instagram can create and auto-suggest emojis and emoji hashtags. This may seem like a trivial application of AI, but Instagram has seen a massive increase in emoji use among all demographics, and being able to interpret and analyze it at large scale via this emoji-to-text translation sets the basis for further analysis on how people use Instagram.
4 – Snapchat
Snapchat introduced facial filters, called Lenses, in 2015. These filters track facial movements, allowing users to add animated effects or digital masks that adjust as their faces move. The technology is powered by Snapchat's 2015 acquisition of Looksery (for a rumored $150 million), a Ukrainian company with patents on using machine learning to track movements in video.
Glimpse into the future
Facebook is betting that the future of messaging will involve conversing with AI chatbots. In early 2015, it acquired Wit.ai, an engine that allows developers to create bots that easily integrate natural language processing into their software. A few months later, it opened its Messenger platform to developers, allowing anyone to build a chatbot and integrate Wit.ai's bot training capability to more easily create conversational bots. Slack, a social messaging tool typically used in the workplace, also allows third parties to incorporate AI-powered chatbots and has even invested in companies that make them. Soon, your shopping, errands, and day-to-day tasks may be completed within a conversation with an AI chatbot on your favorite social network.
Your Amazon searches ("ironing board", "pizza stone", "Android charger", etc.) quickly return a list of the most relevant products. Amazon doesn't reveal exactly how it's doing this, but in a description of its product search technology, Amazon notes that its algorithms "automatically learn to combine multiple relevance features. Our catalog's structured data provides us with many such relevance features and we learn from past search patterns and adapt to what is important to our customers."
You see recommendations for products you’re interested in as “customers who viewed this item also viewed” and “customers who bought this item also bought”, as well as via personalized recommendations on the home page, bottom of item pages, and through email. Amazon uses artificial neural networks to generate these product recommendations.
While Amazon doesn’t reveal what proportion of its sales come from recommendations, research has shown that recommenders increase sales (in this linked study, by 5.9%, but in other studies recommenders have shown up to a 30% increase in sales) and that a product recommendation carries the same sales weight as a two-star increase in average rating (on a five-star scale).
3 – (More) Fraud Protection
Machine learning is used for fraud prevention in online credit card transactions. Fraud is the primary reason online payment processing is more costly for merchants than in-person transactions: Square, a credit card processor popular among small businesses, charges 2.75% for card-present transactions, compared to 3.5% + 15 cents for card-absent transactions. AI is deployed not only to prevent fraudulent transactions but also to minimize the number of legitimate transactions declined after being falsely flagged as fraudulent.
In a press release announcing the rollout of its AI technology, MasterCard noted that 13 times more revenue is lost to false declines than to fraud. By utilizing AI that can learn your purchasing habits, credit card processors minimize the probability of falsely declining your card while maximizing the probability of preventing somebody else from fraudulently charging it.
Glimpse into the future
The key to online shopping has been personalization: online retailers increase revenue by helping you find and buy the products you're interested in. We may soon see retailers take it one step further and design your entire experience individually for you. Google already does this with search, even for users who are logged out, so this is well within the realm of possibility for retailers. Startups like LiftIgniter offer "personalization as a service" to online businesses. Others, like Optimizely, allow businesses to run extensive A/B tests, in which multiple versions of a site run simultaneously to determine which produces the most engaged users.
A standard feature on smartphones today is voice-to-text. By pressing a button or saying a particular phrase (“Ok Google”, for example), you can start speaking and your phone converts the audio into text. Nowadays, this is a relatively routine task, but for many years, accurate automated transcription was beyond the abilities of even the most advanced computers. Google uses artificial neural networks to power voice search. Microsoft claims to have developed a speech-recognition system that can transcribe conversation slightly more accurately than humans.
2 – Smart Personal Assistants
Now that voice-to-text technology is accurate enough to rely on for basic conversation, it has become the control interface for a new generation of smart personal assistants. The first iterations were simpler phone assistants like Siri and Google Now (since succeeded by the more sophisticated Google Assistant), which could perform internet searches, set reminders, and integrate with your calendar.
Amazon expanded upon this model with the announcement of complementary hardware and software components:
- Alexa, an AI-powered personal assistant that accepts voice commands to create to-do lists, order items online, set reminders, and answer questions (via internet searches)
- Echo (and later, Dot) smart speakers that allow you to integrate Alexa into your living room and use voice commands to ask natural language questions, play music, order pizza, hail an Uber, and integrate with smart home devices.
Microsoft has followed suit with Cortana, its own AI assistant that comes pre-loaded on Windows computers and Microsoft smartphones.
Glimpse into the future
Smart assistants will be the key to bridging the gap between humans and "smart" homes. In October 2016, Google announced Google Home, its competitor to Amazon Echo that features deep integration with other Google products, like YouTube, Google Play Music, Nest, and Google Assistant. Through voice commands, users can play music; ask natural language questions; receive sports, news, and finance updates; call an Uber; and make appointments and reminders. According to market research firm Consumer Intelligence Research Partners, Amazon had sold over 5 million Echo devices as of November 2016; however, a month later Amazon's press release boasted a 9x increase in Echo family sales over the previous year's holiday sales, suggesting that 5 million is a significant underestimate. AI assistants, while still not used by a majority of Americans, are rapidly moving into the mainstream.
Facebook CEO Mark Zuckerberg showed what’s currently possible by spending a year building Jarvis, an imitation of the super-intelligent AI assistant in Robert Downey Jr.’s Iron Man films. In a Facebook post, he outlines connecting the myriad of home devices to one network; teaching Jarvis his preferences so it could play music and recognize friends at the door and let them in; building a Facebook messenger bot for Jarvis to issue text commands; and creating an iOS speech recognition app to issue voice commands.
The primary limitation for Zuckerberg, a billionaire with daily access to the world's best engineers, was not the technology itself, but getting devices to communicate with each other and with Jarvis in a central, unified system. This suggests that if Google or Amazon succeeds in integrating its smart speakers with many other home devices (or proprietary versions of them), a Jarvis-like home AI could be available to anyone within the next five years.
We’ve only scratched the surface of examples of AI and ML in day-to-day life. Specific industries and hobbies involve habitual interaction with AI far beyond what’s explored in this article. For example, casual chess players regularly use AI-powered chess engines to analyze their games and practice tactics, and bloggers often use mailing-list services that use ML to optimize reader engagement and open rates.
How will AI affect daily life on a grand scale in the near future? Futurist and Wired magazine co-founder Kevin Kelly predicts that, as AI becomes more deeply integrated in our lives, it will become the new infrastructure powering a second industrial revolution.
Further reading for current and future uses of AI:
- Unseen Ways AI is Making the World a Better Place
- Three Scenarios for the Future of Work in the AI Economy
- How to Apply Machine Learning to Business Problems
- Artificial Intelligence Industry — An Overview by Segment
Artificial Intelligence (AI)
Definition – What does Artificial Intelligence (AI) mean?
Artificial intelligence (AI) is an area of computer science that emphasizes the creation of intelligent machines that work and react like humans. Some of the activities computers with artificial intelligence are designed for include:
- Speech recognition
- Problem solving
Techopedia explains Artificial Intelligence (AI)
Artificial intelligence is a branch of computer science that aims to create intelligent machines. It has become an essential part of the technology industry.
Research associated with artificial intelligence is highly technical and specialized. The core problems of artificial intelligence include programming computers for certain traits such as:
- Problem solving
- Ability to manipulate and move objects
Knowledge engineering is a core part of AI research. Machines can often act and react like humans only if they have abundant information relating to the world. Artificial intelligence must have access to objects, categories, properties and relations between all of them to implement knowledge engineering. Initiating common sense, reasoning and problem-solving power in machines is a difficult and tedious task.
Machine learning is also a core part of AI. Learning without any kind of supervision requires an ability to identify patterns in streams of inputs, whereas learning with adequate supervision involves classification and numerical regressions. Classification determines the category an object belongs to and regression deals with obtaining a set of numerical input or output examples, thereby discovering functions enabling the generation of suitable outputs from respective inputs. Mathematical analysis of machine learning algorithms and their performance is a well-defined branch of theoretical computer science often referred to as computational learning theory.
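The contrast between classification and regression drawn above can be sketched in a few lines of standard-library Python. This is a deliberately minimal illustration, not a real ML framework: the nearest-centroid classifier and least-squares line fit below are the simplest possible instances of each task, and all names and data are made up.

```python
# A minimal sketch contrasting classification (assigning a category) with
# regression (fitting a numeric function), using only the standard library.

import statistics

def classify(point, labeled_points):
    """Classification: return the label whose class centroid is closest to `point`."""
    groups = {}
    for x, label in labeled_points:
        groups.setdefault(label, []).append(x)
    return min(
        groups,
        key=lambda lbl: abs(point - statistics.mean(groups[lbl])),
    )

def fit_line(xs, ys):
    """Regression: fit y = slope*x + intercept by ordinary least squares."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    return slope, mean_y - slope * mean_x

# Classification: heights labeled "short" or "tall"
data = [(150, "short"), (155, "short"), (180, "tall"), (185, "tall")]
print(classify(158, data))   # 158 is closer to the "short" centroid

# Regression: recover y = 2x + 1 from exact samples
slope, intercept = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
print(slope, intercept)      # 2.0 1.0
```

Real systems replace the centroid rule and the hand-rolled least squares with trained models, but the division of labor is the same: classification outputs a category, regression outputs a number.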
Machine perception deals with the capability to use sensory inputs to deduce the different aspects of the world, while computer vision is the power to analyze visual inputs with a few sub-problems such as facial, object and gesture recognition.
Robotics is also a major field related to AI. Robots require intelligence to handle tasks such as object manipulation and navigation, along with sub-problems of localization, motion planning and mapping.
What Are Bitcoins? How Do Bitcoins Work?
Cryptocurrencies are just lines of computer code that hold monetary value. Those lines of code are created by electricity and high-performance computers. Cryptocurrency is also known as digital currency. Either way, it is a form of digital public money that is created by painstaking mathematical computations and policed by millions of computer users called miners. Physically, there is nothing to hold although you can exchange crypto for cash.
Crypto comes from the word cryptography, the security process used to protect transactions that send the lines of code for purchases. Cryptography also controls the creation of new coins, the term used to describe specific amounts of code. Hundreds of coin types now dot the crypto markets; only a handful have the potential to become a viable investment.
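The role cryptography plays here can be shown with a tiny standard-library example: a cryptographic hash such as SHA-256 gives any piece of transaction data a fixed-size fingerprint, and even a one-character change produces a completely different digest. The transaction strings below are invented for illustration.

```python
# Sketch of the hashing behind cryptocurrency security: SHA-256 fingerprints
# transaction data, so any tampering is immediately detectable.

import hashlib

def digest(data: str) -> str:
    """Return the SHA-256 hex digest of a string."""
    return hashlib.sha256(data.encode()).hexdigest()

tx = "alice pays bob 0.5 BTC"
tampered = "alice pays bob 5.0 BTC"

print(digest(tx))
print(digest(tampered))
print(digest(tx) == digest(tampered))  # False: the altered amount changes the digest
```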
Governments have no control over the creation of cryptocurrencies, which is what initially made them so popular. Most cryptocurrencies are designed with a hard cap on their total supply, which means that their production will decrease over time, ideally making any particular coin more valuable in the future.
What Are Bitcoins?
Bitcoin was the first popular cryptocoin. No one knows exactly who created it — most cryptocurrencies are designed for maximum anonymity — but bitcoins first appeared in 2009 from a developer supposedly named Satoshi Nakamoto. He has since disappeared and left behind a Bitcoin fortune.
One of the advantages of bitcoin is that it can be stored offline on a person’s local hardware. That process is called cold storage and it protects the currency from being taken by others. When the currency is stored on the Internet somewhere (hot storage), there is high risk of it being stolen.
Why Bitcoins Are So Controversial
Various recent events turned bitcoin into a media sensation.
Scams, too, are very real in the cryptocurrency world. Naive and savvy investors alike can lose hundreds or thousands of dollars to scams.
How Bitcoins Work
Bitcoins are completely virtual coins designed to be self-contained for their value, with no need for banks to move and store the money. Once you own bitcoins, they behave like physical gold coins: They possess value and trade just as if they were nuggets of gold in your pocket. You can use your bitcoins to purchase goods and services online, or you can tuck them away and hope that their value increases over the years.
Bitcoins are traded from one personal wallet to another. A wallet is a small personal database that you store on your computer drive (i.e., cold storage), on your smartphone, on your tablet or somewhere in the cloud (hot storage).
Bitcoin Values and Regulations
A single bitcoin varies in value daily; check places like Coindesk for the current exchange rate. There is more than $2 billion worth of bitcoins in existence. Bitcoins will stop being created when the total reaches 21 million coins, which is projected to happen around the year 2140. As of 2017, more than half of those bitcoins had been created.
Bitcoin currency is completely unregulated and completely decentralized. There is no national bank or national mint, and there is no depositor insurance coverage. The currency itself is self-contained and un-collateraled, meaning that there is no precious metal behind the bitcoins; the value of each bitcoin resides within each bitcoin itself.
Bitcoins are stewarded by miners, the massive network of people who contribute their personal computers to the bitcoin network. Miners act as a swarm of ledger keepers and auditors for bitcoin transactions. Miners are paid for their accounting work by earning newly created bitcoins for the computing work they contribute to the network.
How Bitcoins Are Tracked
Bitcoin transactions are recorded in a simple public data ledger called the blockchain. Every confirmed transaction is added to this shared ledger, tagged with the addresses of the personal Bitcoin wallets involved.
So, although people cannot easily see your personal identity, they can see the history of your bitcoin wallet. This is a good thing, as a public history adds transparency and security, and helps deter people from using bitcoins for dubious or illegal purposes.
Banking or Other Fees to Use Bitcoins
There are very small fees to use bitcoins. However, there are no ongoing banking fees with bitcoin and other cryptocurrencies because there are no banks involved. Instead, you pay small fees to three groups of bitcoin services: the servers (nodes) who support the network of miners, the online exchanges that convert your bitcoins into dollars, and the mining pools you join.
The owners of some server nodes will charge one-time transaction fees of a few cents every time you send money across their nodes, and online exchanges will similarly charge when you cash your bitcoins in for dollars or euros. Additionally, most mining pools will either charge a small 1 percent support fee or ask for a small donation from the people who join their pools.
In the end, while there are nominal costs to use bitcoin, the transaction fees and mining pool donations are much cheaper than conventional banking or wire transfer fees.
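The fee structure described above is easy to total up. The sketch below uses illustrative numbers, not real fee schedules: a few cents per node transaction, a percentage exchange fee on cash-out, and the typical 1 percent pool support fee mentioned earlier.

```python
# Rough cost sketch for the three fee categories above. All specific
# numbers are examples for illustration, not actual fee schedules.

def total_fees(num_sends, node_fee, cashout_amount, exchange_fee_pct,
               pool_earnings, pool_fee_pct=1.0):
    node_cost = num_sends * node_fee                      # per-transaction node fees
    exchange_cost = cashout_amount * exchange_fee_pct / 100  # fee to convert to dollars
    pool_cost = pool_earnings * pool_fee_pct / 100        # mining pool support fee
    return node_cost + exchange_cost + pool_cost

# 10 sends at $0.05 each, cashing out $500 at a 1% exchange fee,
# and $100 of pool earnings at the typical 1% support fee:
print(total_fees(10, 0.05, 500, 1.0, 100))  # 0.50 + 5.00 + 1.00 = 6.50
```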
Bitcoin Production Facts
Bitcoin mining involves commanding your home computer to work around the clock to solve “proof-of-work” problems (computationally intensive math problems). Each bitcoin math problem has a set of possible 64-digit solutions. Your desktop computer, if it works nonstop, might be able to solve one bitcoin problem in two to three days — likely longer.
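The “proof-of-work” search can be sketched in a few lines of standard-library Python. This toy version looks for a nonce whose SHA-256 hash begins with a run of zero digits; real Bitcoin mining uses the same brute-force idea against a vastly harder target, and the data string here is invented.

```python
# Toy proof-of-work: find a nonce such that sha256(data + nonce) starts
# with `difficulty` leading zero hex digits. Real mining difficulty is
# enormously higher; this runs in well under a second.

import hashlib

def mine(data: str, difficulty: int = 4) -> int:
    """Return the first nonce whose hash meets the difficulty target."""
    target = "0" * difficulty
    nonce = 0
    while True:
        h = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if h.startswith(target):
            return nonce
        nonce += 1

nonce = mine("block data", difficulty=4)
print(nonce, hashlib.sha256(f"block data{nonce}".encode()).hexdigest())
```

Raising `difficulty` by one multiplies the expected search time by 16, which is why serious mining moved from desktops to dedicated hardware.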
For a single personal computer mining bitcoins, you may earn perhaps 50 cents to 75 cents USD per day, minus your electricity costs. For a large-scale miner who runs 36 powerful computers simultaneously, that person can earn up to $500 per day, after costs.
Just like holding a bag of gold coins, a person who takes reasonable precautions will be safe from having their personal bitcoin cache stolen by hackers.
More than hacker intrusion, the real loss risk with bitcoins revolves around not backing up your wallet with a failsafe copy. There is an important .dat file that is updated every time you receive or send bitcoins, so this .dat file should be copied and stored as a duplicate backup every day you do bitcoin transactions.
The collapse of the Mt. Gox bitcoin exchange service was not due to any weakness in the bitcoin system. Rather, that organization collapsed because of mismanagement and the company’s unwillingness to invest in security measures. Mt. Gox, for all intents and purposes, had a large bank with no security guards and it paid the price.
Abuse of Bitcoins
There are currently three known ways that bitcoin currency can be abused.
1) Technical weakness — time delay in confirmation: Bitcoins can be double-spent in some rare instances during the confirmation interval. Because bitcoins travel peer-to-peer, it takes several seconds for a transaction to be confirmed across the P2P swarm of computers. During these few seconds, a dishonest person who employs fast clicking can submit a second payment of the same bitcoins to a different recipient.
While the system will eventually catch the double-spending and negate the dishonest second transaction, if the second recipient transfers goods to the dishonest buyer before they receive confirmation, then that second recipient will lose both the payment and the goods.
2) Human dishonesty — pool organizers taking unfair share slices: Because bitcoin mining is best achieved through pooling (joining a group of thousands of other miners), the organizers of each pool get the privilege of choosing how to divide up any bitcoins that are discovered. Bitcoin mining pool organizers can dishonestly take more bitcoin mining shares for themselves.
3) Human mismanagement — online exchanges: With Mt. Gox being the biggest example, the people running unregulated online exchanges that trade cash for bitcoins can be dishonest or incompetent. This is the same as Fannie Mae and Freddie Mac investment banks going under because of human dishonesty and incompetence. The only difference is that conventional banking losses are partially insured for the bank users, while bitcoin exchanges have no insurance coverage for users.
Three Reasons Why Bitcoins Are Such a Big Deal
There is a lot of controversy around bitcoins.
What is Blockchain Technology? A Step-by-Step Guide For Beginners
Is blockchain technology the new internet?
The blockchain is an undeniably ingenious invention – the brainchild of a person or group of people known by the pseudonym, Satoshi Nakamoto. But since then, it has evolved into something greater, and the main question every single person is asking is: What is Blockchain?
By allowing digital information to be distributed but not copied, blockchain technology created the backbone of a new type of internet. Originally devised for the digital currency, Bitcoin, (Buy Bitcoin) the tech community is now finding other potential uses for the technology.
Bitcoin has been called “digital gold,” and for a good reason. To date, the total value of the currency is close to $112 billion US. And blockchains can make other types of digital value. Like the internet (or your car), you don’t need to know how the blockchain works to use it. However, having a basic knowledge of this new technology shows why it’s considered revolutionary. So, we hope you enjoy this What Is Blockchain guide. And if you already know what blockchain is and want to become a blockchain developer (in high demand as of 2018!), please check out our in-depth blockchain tutorial and create your very first blockchain.
How Does Blockchain Work?
Picture a spreadsheet that is duplicated thousands of times across a network of computers. Then imagine that this network is designed to regularly update this spreadsheet and you have a basic understanding of the blockchain.
Information held on a blockchain exists as a shared — and continually reconciled — database. This is a way of using the network that has obvious benefits. The blockchain database isn’t stored in any single location, meaning the records it keeps are truly public and easily verifiable. No centralized version of this information exists for a hacker to corrupt. Hosted by millions of computers simultaneously, its data is accessible to anyone on the internet.
To go in deeper with the Google spreadsheet analogy, I would like you to read this piece from a blockchain specialist.
Blockchain Durability and robustness
Blockchain technology is like the internet in that it has a built-in robustness. By storing blocks of information that are identical across its network, the blockchain:
- Cannot be controlled by any single entity.
- Has no single point of failure.
Bitcoin was invented in 2008. Since that time, the Bitcoin blockchain has operated without significant disruption. (To date, any problems associated with Bitcoin have been due to hacking or mismanagement. In other words, these problems come from bad intention and human error, not flaws in the underlying concepts.)
The internet itself has proven to be durable for almost 30 years. It’s a track record that bodes well for blockchain technology as it continues to be developed.
Transparent and incorruptible
The blockchain network lives in a state of consensus, one that automatically checks in with itself every ten minutes. A kind of self-auditing ecosystem of a digital value, the network reconciles every transaction that happens in ten-minute intervals. Each group of these transactions is referred to as a “block”. Two important properties result from this:
- Transparency: data is embedded within the network as a whole and is, by definition, public.
- Incorruptibility: altering any unit of information on the blockchain would mean using a huge amount of computing power to override the entire network.
In theory, this could be possible. In practice, it’s unlikely to happen. Taking control of the system to capture Bitcoins, for instance, would also have the effect of destroying their value.
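The incorruptibility described above comes from hash-chaining: each block stores the hash of the previous block, so changing any record invalidates every link after it. The pure-Python sketch below illustrates the idea; the block layout and toy transactions are invented, and a real blockchain adds proof-of-work and network consensus on top.

```python
# Minimal hash-chained ledger: tampering with any block's data breaks
# the stored hash, so the whole chain can be re-verified cheaply.

import hashlib

def block_hash(prev_hash: str, data: str) -> str:
    return hashlib.sha256(f"{prev_hash}{data}".encode()).hexdigest()

def build_chain(records):
    chain, prev = [], "0" * 64  # genesis placeholder
    for data in records:
        h = block_hash(prev, data)
        chain.append({"prev": prev, "data": data, "hash": h})
        prev = h
    return chain

def is_valid(chain):
    """Recompute every block's hash and compare it to what is stored."""
    return all(b["hash"] == block_hash(b["prev"], b["data"]) for b in chain)

chain = build_chain(["tx: A->B 1.0", "tx: B->C 0.4", "tx: C->A 0.1"])
print(is_valid(chain))            # True

chain[1]["data"] = "tx: B->C 40"  # tamper with a middle block...
print(is_valid(chain))            # False: the stored hash no longer matches
```

An attacker would have to recompute every subsequent block’s hash faster than the honest network extends the chain, which is the “huge amount of computing power” the text refers to.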
A network of nodes
A network of so-called computing “nodes” makes up the blockchain.
Together they create a powerful second-level network, a wholly different vision for how the internet can function.
Every node is an “administrator” of the blockchain, and joins the network voluntarily (in this sense, the network is decentralized). However, each one has an incentive for participating in the network: the chance of winning Bitcoins.
Nodes are said to be “mining” Bitcoin, but the term is something of a misnomer. In fact, each one is competing to win Bitcoins by solving computational puzzles. Bitcoin was the raison d’etre of the blockchain as it was originally conceived. It’s now recognized to be only the first of many potential applications of the technology.
There are an estimated 1600 Bitcoin-like cryptocurrencies (exchangeable value tokens) already available. As well, a range of other potential adaptations of the original blockchain concept are currently active, or in development.
The idea of decentralization
By design, the blockchain is a decentralized technology.
Anything that happens on it is a function of the network as a whole. Some important implications stem from this. By creating a new way to verify transactions aspects of traditional commerce could become unnecessary. Stock market trades become almost simultaneous on the blockchain, for instance — or it could make types of record keeping, like a land registry, fully public. And decentralization is already a reality.
A global network of computers uses blockchain technology to jointly manage the database that records Bitcoin transactions. That is, Bitcoin is managed by its network, and not any one central authority. Decentralization means the network operates on a user-to-user (or peer-to-peer) basis. The forms of mass collaboration this makes possible are just beginning to be investigated.
Who will use the blockchain?
As web infrastructure, you don’t need to know about the blockchain for it to be useful in your life.
Currently, finance offers the strongest use cases for the technology. International remittances, for instance. The World Bank estimates that over $430 billion US in money transfers were sent in 2015. And at the moment there is a high demand for blockchain developers.
The blockchain potentially cuts out the middleman for these types of transactions. Personal computing became accessible to the general public with the invention of the Graphical User Interface (GUI), which took the form of a “desktop”. Similarly, the most common GUI devised for the blockchain is the so-called “wallet” application, which people use to buy things with Bitcoin, and to store it along with other cryptocurrencies.
Transactions online are closely connected to the processes of identity verification. It is easy to imagine that wallet apps will transform in the coming years to include other types of identity management.
The Blockchain & Enhanced security
By storing data across its network, the blockchain eliminates the risks that come with data being held centrally.
Its network lacks centralized points of vulnerability that computer hackers can exploit. Today’s internet has security problems that are familiar to everyone. We all rely on the “username/password” system to protect our identity and assets online. Blockchain security methods instead use encryption technology.
The basis for this is the so-called public and private “keys”. A “public key” (a long, randomly generated string of numbers) is a user’s address on the blockchain. Bitcoins sent across the network get recorded as belonging to that address. The “private key” is like a password that gives its owner access to their Bitcoin or other digital assets. Store your data on the blockchain and it is incorruptible. This is true, although protecting your digital assets will also require safeguarding your private key, for example by printing it out to create what’s referred to as a paper wallet.
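The one-way relationship between a private key and a public address can be sketched with the standard library. This is a deliberate simplification: real Bitcoin derives addresses from ECDSA public keys on the secp256k1 curve, which the stdlib does not provide, so a SHA-256 hash of a random secret stands in here purely to show the asymmetry.

```python
# Simplified key/address sketch (NOT real Bitcoin key derivation): the
# address is easy to derive from the secret, but the secret cannot be
# recovered from the address.

import hashlib
import secrets

def new_keypair():
    private_key = secrets.token_hex(32)  # 256 bits of randomness, keep secret
    address = hashlib.sha256(private_key.encode()).hexdigest()  # one-way derivation
    return private_key, address

priv, addr = new_keypair()
print("address:", addr)  # safe to publish; reveals nothing about the key
```

Losing the private key means losing access to the funds at that address, which is why the text recommends offline backups such as paper wallets.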
A second-level network
With blockchain technology, the web gains a new layer of functionality.
Already, users can transact directly with one another — Bitcoin transactions in 2017 averaged around $2 billion US per day. With the added security brought by the blockchain, new internet businesses are on track to unbundle the traditional institutions of finance.
Goldman Sachs believes that blockchain technology holds great potential especially to optimize clearing and settlements, and could represent global savings of up to $6bn per year.
The Blockchain: a New Web 3.0?
Indeed.com, one of the biggest job portals in the world, published some interesting statistics regarding the rise of Blockchain jobs. It looks like the number of blockchain jobs increased from December 2016 to December 2017 by a staggering 207%. But that’s not the end of it. According to the stats, this number has increased by a scarcely believable 631% since November 2015.
The blockchain gives internet users the ability to create value and authenticate digital information. What new business applications will result?
Distributed ledgers enable the coding of simple contracts that will execute when specified conditions are met. Ethereum is an open source blockchain project that was built specifically to realize this possibility. Still in its early stages, Ethereum has the potential to leverage the usefulness of blockchains on a truly world-changing scale.
At the technology’s current level of development, smart contracts can be programmed to perform simple functions. For instance, a derivative could be paid out when a financial instrument meets a certain benchmark, with the use of blockchain technology and Bitcoin enabling the payout to be automated.
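The derivative example above boils down to a conditional payout. The sketch below mirrors that logic in plain Python; a real smart contract would run on-chain (for example in Ethereum’s EVM, typically written in Solidity), and the function name and prices here are invented.

```python
# Toy model of the automated derivative payout described above: funds are
# released if and only if the observed price meets the agreed benchmark.

def settle_derivative(observed_price: float, benchmark: float, payout: float) -> float:
    """Pay out automatically when the instrument meets the benchmark, else nothing."""
    return payout if observed_price >= benchmark else 0.0

print(settle_derivative(105.0, 100.0, 10.0))  # benchmark met: pays 10.0
print(settle_derivative(95.0, 100.0, 10.0))   # benchmark missed: pays 0.0
```

The point of putting such logic on a blockchain is that neither party can alter or refuse the settlement once the condition is observed.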
The sharing economy
With companies like Uber and AirBnB flourishing, the sharing economy is already a proven success. Currently, however, users who want to hail a ride-sharing service have to rely on an intermediary like Uber. By enabling peer-to-peer payments, the blockchain opens the door to direct interaction between parties — a truly decentralized sharing economy results.
An early example, OpenBazaar uses the blockchain to create a peer-to-peer eBay. Download the app onto your computing device, and you can transact with OpenBazaar vendors without paying transaction fees. The “no rules” ethos of the protocol means that personal reputation will be even more important to business interactions than it currently is on eBay.
Crowdfunding initiatives like Kickstarter and Gofundme are doing the advance work for the emerging peer-to-peer economy. The popularity of these sites suggests people want to have a direct say in product development. Blockchains take this interest to the next level, potentially creating crowd-sourced venture capital funds.
In 2016, one such experiment, the Ethereum-based DAO (Decentralized Autonomous Organization), raised an astonishing $200 million USD in just over two months. Participants purchased “DAO tokens” allowing them to vote on smart contract venture capital investments (voting power was proportionate to the number of DAO tokens they held). A subsequent hack of project funds proved that the project was launched without proper due diligence, with disastrous consequences. Regardless, the DAO experiment suggests the blockchain has the potential to usher in “a new paradigm of economic cooperation.”
By making the results fully transparent and publicly accessible, distributed database technology could bring full transparency to elections or any other kind of poll taking. Ethereum-based smart contracts help to automate the process.
The app, Boardroom, enables organizational decision-making to happen on the blockchain. In practice, this means company governance becomes fully transparent and verifiable when managing digital assets, equity or information.
Supply chain auditing
Consumers increasingly want to know that the ethical claims companies make about their products are real. Distributed ledgers provide an easy way to certify that the backstories of the things we buy are genuine. Transparency comes with blockchain-based timestamping of a date and location — on ethical diamonds, for instance — that corresponds to a product number.
The UK-based Provenance offers supply chain auditing for a range of consumer goods. Making use of the Ethereum blockchain, a Provenance pilot project ensures that fish sold in Sushi restaurants in Japan has been sustainably harvested by its suppliers in Indonesia.
Decentralizing file storage on the internet brings clear benefits. Distributing data throughout the network protects files from getting hacked or lost.
Inter Planetary File System (IPFS) makes it easy to conceptualize how a distributed web might operate. Similar to the way BitTorrent moves data around the internet, IPFS gets rid of the need for centralized client-server relationships (i.e., the current web). An internet made up of completely decentralized websites has the potential to speed up file transfer and streaming times. Such an improvement is not only convenient. It’s a necessary upgrade to the web’s currently overloaded content-delivery systems.
The crowdsourcing of predictions on event probability is proven to have a high degree of accuracy. Averaging opinions cancels out the unexamined biases that distort judgment. Prediction markets that payout according to event outcomes are already active. Blockchains are a “wisdom of the crowd” technology that will no doubt find other applications in the years to come.
The prediction market application Augur makes share offerings on the outcome of real-world events. Participants can earn money by buying into the correct prediction. The more shares purchased in the correct outcome, the higher the payout will be. With a small commitment of funds (less than a dollar), anyone can ask a question, create a market based on a predicted outcome, and collect half of all transaction fees the market generates.
Protection of intellectual property
As is well known, digital information can be infinitely reproduced — and distributed widely thanks to the internet. This has given web users globally a goldmine of free content. However, copyright holders have not been so lucky, losing control over their intellectual property and suffering financially as a consequence. Smart contracts can protect copyright and automate the sale of creative works online, eliminating the risk of file copying and redistribution.
Mycelia uses the blockchain to create a peer-to-peer music distribution system. Founded by the UK singer-songwriter Imogen Heap, Mycelia enables musicians to sell songs directly to audiences, as well as license samples to producers and divvy up royalties to songwriters and musicians — all of these functions being automated by smart contracts. The capacity of blockchains to issue payments in fractional cryptocurrency amounts (micropayments) suggests this use case for the blockchain has a strong chance of success.
Internet of Things (IoT)
What is the IoT? The network-controlled management of certain types of electronic devices — for instance, the monitoring of air temperature in a storage facility. Smart contracts make the automation of remote systems management possible. A combination of software, sensors, and the network facilitates an exchange of data between objects and mechanisms. The result increases system efficiency and improves cost monitoring.
The biggest players in manufacturing, tech and telecommunications are all vying for IoT dominance. Think Samsung, IBM and AT&T. A natural extension of existing infrastructure controlled by incumbents, IoT applications will run the gamut from predictive maintenance of mechanical parts to data analytics, and mass-scale automated systems management.
Blockchain technology enables the buying and selling of the renewable energy generated by neighborhood microgrids. When solar panels make excess energy, Ethereum-based smart contracts automatically redistribute it. Similar types of smart contract automation will have many other applications as the IoT becomes a reality.
Located in Brooklyn, Consensys is one of the foremost companies globally that is developing a range of applications for Ethereum. One project they are partnering on is Transactive Grid, working with the distributed energy outfit, LO3. A prototype project currently up and running uses Ethereum smart contracts to automate the monitoring and redistribution of microgrid energy. This so-called “intelligent grid” is an early example of IoT functionality.
There is a definite need for better identity management on the web. The ability to verify your identity is the lynchpin of financial transactions that happen online. However, remedies for the security risks that come with web commerce are imperfect at best. Distributed ledgers offer enhanced methods for proving who you are, along with the possibility to digitize personal documents. Having a secure identity will also be important for online interactions — for instance, in the sharing economy. A good reputation, after all, is the most important condition for conducting transactions online.
Developing digital identity standards is proving to be a highly complex process. Technical challenges aside, a universal online identity solution requires cooperation between private entities and government. Add to that the need to navigate legal systems in different countries and the problem becomes exponentially difficult. E-Commerce on the internet currently relies on the SSL certificate (the little green lock) for secure transactions on the web. Netki is a startup that aspires to create an SSL standard for the blockchain. Having recently announced a $3.5 million seed round, Netki expects a product launch in early 2017.
AML and KYC
Anti-money laundering (AML) and know your customer (KYC) practices have a strong potential for being adapted to the blockchain. Currently, financial institutions must perform a labour intensive multi-step process for each new customer. KYC costs could be reduced through cross-institution client verification, and at the same time increase monitoring and analysis effectiveness.
Startup Polycoin has an AML/KYC solution that involves analysing transactions. Those transactions identified as being suspicious are forwarded on to compliance officers. Another startup Tradle is developing an application called Trust in Motion (TiM). Characterized as an “Instagram for KYC”, TiM allows customers to take a snapshot of key documents (passport, utility bill, etc.). Once verified by the bank, this data is cryptographically stored on the blockchain.
Today, in exchange for their personal data people can use social media platforms like Facebook for free. In future, users will have the ability to manage and sell the data their online activity generates. Because it can be easily distributed in small fractional amounts, Bitcoin — or something like it — will most likely be the currency that gets used for this type of transaction.
The MIT project Enigma understands that user privacy is the key precondition for creating of a personal data marketplace. Enigma uses cryptographic techniques to allow individual data sets to be split between nodes, and at the same time run bulk computations over the data group as a whole. Fragmenting the data also makes Enigma scalable (unlike those blockchain solutions where data gets replicated on every node). A Beta launch is promised within the next six months.
Land title registration
As publicly accessible ledgers, blockchains can make all kinds of record-keeping more efficient. Property titles are a case in point. They tend to be susceptible to fraud, as well as costly and labour intensive to administer.
A number of countries are undertaking blockchain-based land registry projects. Honduras was the first government to announce such an initiative in 2015, although the current status of that project is unclear. This year, the Republic of Georgia cemented a deal with the Bitfury Group to develop a blockchain system for property titles. Reportedly, Hernando de Soto, the high-profile economist and property rights advocate, will be advising on the project. Most recently, Sweden announced it was experimenting with a blockchain application for property titles.
The potential for added efficiency in share settlement makes a strong use case for blockchains in stock trading. When executed peer-to-peer, trade confirmations become almost instantaneous (as opposed to taking three days for clearance). Potentially, this means intermediaries — such as the clearing house, auditors and custodians — get removed from the process.
Numerous stock and commodities exchanges are prototyping blockchain applications for the services they offer, including the ASX (Australian Securities Exchange), the Deutsche Börse (Frankfurt’s stock exchange) and the JPX (Japan Exchange Group). The most high-profile, as the acknowledged first mover in the area, is Nasdaq’s Linq, a platform for private market trading (typically between pre-IPO startups and investors). In partnership with the blockchain tech company Chain, Linq announced the completion of its first share trade in 2015. More recently, Nasdaq announced the development of a trial blockchain project for proxy voting on the Estonian Stock Market.
Information Technology in Business: The big picture
Computers and information systems are essential parts of every business today. Like accounting and legal, every business needs to invest in technology to compete. Technology is both a cost of doing business, and an opportunity to do more business. Most people I talk with recognize the necessity of having a computer, an email address, and a web site, but still look at the upfront cost more than other issues.
After spending some time working with dozens of businesses, I think it’s time to take a step back and look at the big picture of technology in business. Let’s take a reporter’s view of the topic, and ask the basic questions: who, what, where, why, when, and how much?
For today, we’ll keep this short, but each of these questions deserves a more complete article in the future.
What are the benefits of technology for a business? There are many, but most fall under a few categories:
- Reach more potential customers and develop business relationships with them
- Streamline operations, reduce costs, improve efficiency, maximize profit, minimize waste, devote talent to core business instead of overhead
- Provide better service to customers
- Support better relationships with key partners
- Allow customers to better guide the business
The very first question businesses should ask before spending any money or time on technology is, “why am I doing this?” If there is not a core business benefit to be gained, why do it in the first place?
Established businesses outside the technology industry typically spend between half a percent and 10 percent of their annual revenue on technology, depending mostly on the industry. Manufacturing and retail are typically at the low end of this range, while finance and health care are typically at the high end.
If you’re at the low end of technology spending for your industry, you may be missing out on some key benefits technology can provide. If you’re at the high end, you may be spending more than you need to on proprietary solutions, or you may be leading your industry with some strategic investment.
What costs do you need to consider as part of your technology budget? These break down into several categories:
- Initial cost—hardware and software, and training
- Ongoing cost—maintaining systems, including licenses for proprietary software, hosting, and support
- Upgrade cost—cost of upgrades, and expected lifespan of systems/frequency of upgrades
- Value proposition—how much employee time will the system save? How much new business could the system generate?
- Opportunity cost—how much potential revenue is lost by not implementing a system? What are your competitors doing in this area?
- Risk—what are the risks of a particular system? What does it cost to mitigate those risks?
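Weighing these categories often comes down to simple arithmetic: how many months until the value a system generates covers what it costs to buy and run? A rough payback sketch, using entirely hypothetical numbers:

```python
def payback_months(initial_cost, monthly_cost, monthly_value):
    """Months until cumulative value covers cumulative cost (None if never)."""
    net_monthly = monthly_value - monthly_cost
    if net_monthly <= 0:
        return None  # the system never pays for itself
    # Ceiling division: first whole month where net savings cover the upfront spend.
    return -(-initial_cost // net_monthly)

# Hypothetical system: $12,000 upfront, $500/month to run,
# saves 40 staff-hours/month at $50/hour = $2,000/month of value.
months = payback_months(12_000, 500, 2_000)  # -> 8 months
```

The same function answers the opportunity-cost question in reverse: every month of delay forgoes `monthly_value - monthly_cost` in net benefit.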
Should you spend most of your technology budget on infrastructure, hosted applications, custom line-of-business applications, or what? The answer to this depends a lot on your industry, but even more on your specific business. Generally, most businesses spend around half of their technology budget on infrastructure—computers, networking equipment, and Internet Service Providers (ISPs). As the world moves more and more online, and open source software becomes more compelling, there are huge opportunities for savings in these areas, for businesses that can take advantage of them.
There’s a fine line between too much and not enough. Spend too much on technology, and it will consume your time and budget, leaving you ill-prepared to do anything else in your business. Spend too little, and your competition may improve their business to the point that you can’t compete.
You need to implement enough technology to see a real benefit, prevent the worst disasters, and not miss out on any major opportunities, while not spending more than you can handle.
Technology has a cost not just in dollars, but also in the time you and your employees need to spend adapting to it. Bite off too big a chunk and technology becomes counter-productive. Nearly always, small, incremental, ongoing chunks are a better way to bring technology into your business than large all-or-nothing systems that promise to do everything right away.
Finally, you need to decide who will help you implement technology in your business. Will you do it yourself? Do you purchase an off-the-shelf product? Do you use free software? Do you hire a programmer to create a custom system? Do you use a hosted system? Do you hire a consultant to help?
Obviously, as an open source consultant, I think the answer is usually hire a good consultant to help you use as much quality free software in your business as possible. Whether or not to use a hosted system depends on your specific business needs. Off-the-shelf proprietary products are quickly becoming the least favorable way to go, but there are still a few niches where there isn’t a viable alternative.
Many businesses are stuck at a tactical level, trying to stay ahead on cash flow and payroll, and don’t have time to think about technology in a strategic way. But a strong plan for technology should be a part of every business plan, and re-evaluated whenever taking a strategic look at a business.
If you need assistance answering these questions in your business, Freelock would be happy to help.
Freelock is proud to be sponsoring a business computing lab at LinuxFest Northwest this year. Come to the event for an opportunity to try out a bunch of different open source business applications for managing your web site, your business finances, projects, customers, and knowledge. I am giving a presentation at the event as an introduction and demonstration of an entire set of systems to run a business, all using free software. Freelock will also have a booth at the event, and all of our employees will be there.
The event is held every year the last weekend of April. This year it is April 26 and 27 at the Bellingham Technical College, and admission is free. See http://www.linuxfestnorthwest.org for more details.
Congratulations to our client, HomeSavvi.com, who officially launched their web site on March 11. http://www.HomeSavvi.com is a community web site for people to learn, share, and help each other with home remodeling projects. Freelock has managed their servers since the company started, and provides system administration advice and support for this web startup company which recently received funding.
Ticket-Tee is planning a big motorcycle rally/independent music festival this June in Missoula, Montana, called the Snake Pit Music and Motorcycle Connection; registration is available on the Ticket-Tee site. Ticket-Tee is a unique business that creates t-shirts and other items that act as tickets to an event. Freelock provided custom programming services to hook up the Ticket-Tee site to their merchant account, as well as to track referrals so that bands and clubs get a referral commission for their fans who purchase tickets for an event.
Freelock is growing
Erik Olson joined the Freelock team in February, bringing our in-house team to six. (And yes, I mean literally in-house—it’s getting quite crowded here!) Erik previously helped his wife run an Italian wine import business, and brings some much-needed project management skills to the company. Please show him a warm welcome.
Suzanne Murdock, from our partner company Dodd and Associates, is now handling our invoicing—you’ll see most of our bills coming from her. Carl Symons, Daniel Seirawan, and Chris Longmoon are helping us get the word out to businesses that could use our services.
With all this help running the company, I’m starting to do more work on client projects again, and will be taking a stronger lead particularly on the development team. Expect to see us growing more, and getting better office space, soon!
Internet of Things has major potential in Vietnam
Managers of IT enterprises believe that the potential of the Internet of Things (IoT) in Vietnam is huge, but that more supportive policies from the government are essential, the 11th Asia IoT Business Platform 2016, held in Hanoi on November 29 and 30, heard. This was the first time the conference had been held in Vietnam.
Mr. Zaf Coelho, Project Director of the Asia IoT Business Platform, is optimistic about the growth of IoT in Vietnam, citing gaps in the market as areas that IoT players can fill.
He also insisted that there is no better time to develop IoT solutions in Vietnam to foster socioeconomic development and increase the country’s competitive advantages.
The conference highlighted the value and economic impact of IoT, including improving private and public sector enterprise productivity, harnessing ICT to alleviate problems in dense cities, and powering Smart Cities in Vietnam.
The panel discussions featured key topics relating to the larger IoT landscape, including domestic policies to support telecommunications and IoT infrastructure and the challenges in developing a Smart City in a developing country like Vietnam.
There have been many opportunities for IoT growth in Vietnam recently, according to internal research by Industry Platform, an organizer of the Asia IoT Business Platform. With IoT long considered an important aspect of daily life, many major projects are being rolled out to leverage IoT technologies to improve quality of life.
Evaluating the position of IoT, Vice President of VNPT Vinaphone Pham Anh Tuan told VET that “IoT changes the way we live.”
“Since IoT first appeared in Vietnam in 2008, there are now about 300,000 such products,” he added.
The Asia IoT Business Platform conference gathered IoT leaders who shared case studies and insights on the use of IoT technologies across different sectors, such as agriculture, banking, finance, and commerce, including leveraging data analytics to determine traffic patterns and accelerating digital transformation via cloud technology.
“The Asia IoT Business Platform is an important industry event that serves as a platform in bridging local and international enterprises and companies to network, find new partners, grow market share, increase business opportunities, and strengthen international competitiveness,” said Deputy Minister of Science and Technology Pham Dai Duong.
This is evident in Smart City projects in Vietnam that started as early as 2008. Since then, five more cities in the country have been pursuing their own Smart City projects, including Hanoi, Da Nang, and Hai Phong. The recent launch of the Hoa Lac IoT Lab further indicates the country’s ambition to develop its IoT ecosystem.
Vietnam, one of the fastest-growing economies in the world, is rapidly growing its IT sector to realize its goal of becoming an industrialized country by 2020. The government has pledged to invest $111.6 million from the State budget in the ICT sector by 2020, incentivizing local and international firms to invest in the country.
Vietnamese enterprises usually apply IT in manufacturing or management. The application of IoT is still limited but in the future it will be useful in resolving problems in different sectors, according to Mr. Duong Ton Bao, Deputy Head of the Division of IT Application in Enterprises at the Ministry of Information and Communication.
INTERNET OF THINGS (IoT)
- Real-time health monitor
- Smart home controller
- Mobile asset tracking system
- Environment monitor and control
- IoT marketplace
- Vehicle management system
- M2M service creation
- Home controller apps
- Equipment control
- Light therapy device control
- Domains: smart devices, asset tracking, smart agriculture, device testing, smart home, remote control
- Devices: mobile, sensor, GPS, light therapy device, Beacon, electronic plug
- Platforms: ARM, Ar7240, RT-Linux, Android
- Protocols: LoRa, BLE, Beacon, RF, SimpliciTI, MQTT, 6LowPan, RFID, Zigbee, WIFI, 3G, GSM, GPRS, TCP/IP, SNMP
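Of the protocols listed, MQTT is a good illustration of why these fit constrained IoT devices: clients publish to hierarchical topics, and subscribers use lightweight wildcard filters (`+` matches one level, `#` matches the remainder). The following is a simplified sketch of MQTT-style topic matching, not a full implementation of the spec's edge cases:

```python
def topic_matches(filter_str, topic):
    """Check an MQTT-style topic against a subscription filter.

    '+' matches exactly one topic level; '#' matches all remaining levels.
    """
    f_parts, t_parts = filter_str.split("/"), topic.split("/")
    for i, part in enumerate(f_parts):
        if part == "#":
            return True          # multi-level wildcard: match everything below
        if i >= len(t_parts):
            return False         # filter is longer than the topic
        if part != "+" and part != t_parts[i]:
            return False         # literal level mismatch
    return len(f_parts) == len(t_parts)

# A home controller might subscribe to "home/+/temperature" to receive
# readings from every room with a single subscription.
```

In a real deployment a broker (e.g. one speaking MQTT 3.1.1) performs this matching; the sketch only shows why a sensor can stay simple while subscribers stay flexible.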
USE OF INFORMATION TECHNOLOGY
What Is Information Technology (IT)
Information Technology is any computer-based tool used to work with information and support the information-processing needs of an organization. These tools include computers, software, routers, servers, printers, and modems, just to name a few.
Businesses increase their efficiency when they embrace information technology. The primary goal of a business is to serve its customers. In this age of information technology, a business can gain a competitive advantage when it uses information technology to its maximum. Today, perfect service is only possible if a business has the right information in the hands of the right people at the right time, and this can only occur through the appropriate use of information technology. This means that the challenge facing any business is to plan for, develop, manage, and use its three most important resources – information, information technology, and people – to provide perfect service to its customers. Below I have listed a few uses of information technology, both in our society and in business.
USE OF INFORMATION TECHNOLOGY IN BUSINESS
Whether small or big, every business needs to plan how to exploit the opportunities that information technology brings. Businesses use IT in four ways: to support (1) information-processing tasks, (2) decision making, (3) shared information through decentralized computing, and (4) innovation. Below are detailed points on how a business can use information technology to succeed.
- Supports Information-Processing Tasks: Businesses use IT to support basic information-processing tasks. These tasks range from computing and printing payroll checks, to creating presentations, to setting up websites from which customers can place orders for products or services. At this stage, a business can use IT to create company database applications that allow employees to access information at any given moment. It can also use IT tools to set up networks that enable departments to share information without hassle or wasted time.
- Supports Decision-Making Tasks: Businesses also use information technology to support decision-making tasks, and this is achieved through online analytical processing (OLAP). OLAP is the manipulation of information to support decision making. It can range from performing simple queries on a database to determine which customers have overdue accounts, to employing sophisticated artificial intelligence tools such as neural networks and genetic algorithms to solve a complex problem or take advantage of an opportunity. In every case, OLAP supports effective decision making. You can perform OLAP using databases and data warehouses.
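The "simple queries" end of OLAP can be pictured as a slice-and-aggregate over business records: filter to the rows that matter, then roll them up per customer. The records and field names below are hypothetical, chosen only to illustrate the overdue-accounts example:

```python
from datetime import date

# Hypothetical accounts-receivable records.
invoices = [
    {"customer": "Acme",  "due": date(2024, 1, 15), "paid": False, "amount": 1200},
    {"customer": "Beta",  "due": date(2024, 3, 1),  "paid": True,  "amount": 800},
    {"customer": "Acme",  "due": date(2024, 2, 1),  "paid": False, "amount": 450},
    {"customer": "Gamma", "due": date(2024, 4, 10), "paid": False, "amount": 300},
]

def overdue_by_customer(invoices, today):
    """Total unpaid, past-due amount per customer: filter, then aggregate."""
    totals = {}
    for inv in invoices:
        if not inv["paid"] and inv["due"] < today:
            totals[inv["customer"]] = totals.get(inv["customer"], 0) + inv["amount"]
    return totals

report = overdue_by_customer(invoices, today=date(2024, 3, 15))
# -> {"Acme": 1650}  (Gamma's invoice is unpaid but not yet due)
```

In practice the same question is usually asked of a database or data warehouse in SQL (`WHERE paid = 0 AND due < :today GROUP BY customer`); the sketch just makes the filter-and-aggregate shape explicit.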
- Supports Shared Information through Decentralized Computing: Decentralized computing is an environment in which an organization splits computing power and locates it in functional business areas as well as on the desktops of knowledge workers. Shared information is an environment in which an organization’s information is organized in one central location, allowing anyone to access and use it as they need to. Today, most businesses have created a decentralized computing structure that brings together the entire spectrum of the business’s information in an orderly fashion so that it can be accessed and used by anyone who needs it. This structure is most often a database, which is designed to support the concept of shared information directly.
- Supports Innovation: Information technology tools not only help with information-processing tasks, decision-making tasks, and shared information through decentralized computing, but they also enable innovation. Tools like the internet let us research any subject, and the information acquired in the process can feed the creative design of services or products.
USE OF INFORMATION TECHNOLOGY IN SOCIETY
Society has embraced information technology (IT) in various ways. IT has impacted (1) education, (2) communication, (3) job creation, (4) agriculture, and (5) entertainment. Below I have listed specific uses of technology in our society today.
- Online Education: Unlike in the past, when education was tied to specific boundaries, the education sector has changed. With the introduction of online education services, students can learn from anywhere using the internet; this has helped spread essential education materials to students across the globe. Online education is also being enhanced by mobile applications that let students access education material via their phones.
- Social Networks and Mobile Phones: Society has used information technology to create technologies that simplify communication and relationships. Mobile phones have made communication more convenient, and social networks like Facebook.com have played a significant role in helping people rediscover old friends and make new ones. People also use online dating platforms to find long-term partners; sites like Match.com are known for connecting people, and these relationships often result in marriage.
- Job Creation: Today there are many companies that have been created using information technology, and this has eased the problem of job scarcity to a certain degree. Many of these big IT corporations started in homes and bedrooms, but now they employ large numbers of people, adding value to our society. Examples include Google, Facebook, Amazon, Dell, Microsoft, LinkedIn, and Twitter, to mention just a few.
- Modernized Agriculture: Information technology has also played a prominent role in advancing the agricultural sector. Nowadays a farmer can sell his or her products right from the farm using the internet. All they have to do is set up a website for their products; orders are placed directly via the site, and the farmer delivers fresh products to the client. This cuts out the middlemen who tend to raise the price of agricultural products to make a profit. Both the farmer and the consumer benefit: the consumer gets the product at a low price while it is still fresh, and the farmer makes more money.
- Modernized Entertainment: The invention of technologies like iPads, video games, and home entertainment systems enhances users’ lives. Music and movies can be accessed online for a small subscription fee. Companies like Netflix and Hulu have played a significant role in making home entertainment better.
How is Information Technology Used in Different Industries?
Many industries are realizing that information technology has the potential to give them a competitive edge, increasing profit margins and consumer satisfaction. Once used only internally to control costs and increase operational efficiency, information technology is now a major part of industries seeking a competitive edge in providing new channels of service and meeting the growing demands from consumers for information and value. From simple e-mail communications to online teleconferences connecting organizations worldwide, information technology is having a direct impact on how industries conduct business. Using information technology, automated systems can track all steps of the manufacturing process, predict purchasing habits and patterns, exchange contact information between vendors and customers, and analyze massive amounts of data. As more industries realize the many ways the new technology can help them provide their products or services efficiently, new information technology jobs will become available. The increase in information technology jobs across every sector of industry is also calling for professionals with specialized knowledge of key industries. More than ninety percent of all information technology personnel work in business sectors outside of the IT industry, so those interested in advancing their information technology careers should pursue technical or professional certification relevant to specific industries or fields of interest.
Information Technology in Retail
The internet has been a huge factor in changing the way many retailers do business. Perhaps the greatest impact is the direct line of communication it provides between retailers and consumers. The future of retail requires retailers to embrace the latest changes in information technology that enhance how they engage with customers. Information technology allows retailers to be available day or night, interacting with customers how, when, and where they are ready to shop. Within the retail industry there are information technology careers in data management, software management, business architecture, social media, and more. Retailers depend on information technology to manage inventory, track customer-purchasing habits, predict trends, and deliver goods and services. Many retailers understand the potential that mobile communications offer to engage customers outside the physical retail location by offering real-time discounts and promotional updates, and by enhancing options to purchase online or in store. Wireless communication, QR codes, and augmented reality are only a few examples of the changes information technology is bringing to the retail industry.
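Tracking customer-purchasing habits and predicting trends often starts with nothing more exotic than counting transactions. A toy sketch of ranking products by purchase frequency; the transaction log below is invented purely for illustration:

```python
from collections import Counter

# Hypothetical transaction log: (customer_id, product) pairs from point of sale.
transactions = [
    ("c1", "coffee"), ("c1", "coffee"), ("c1", "milk"),
    ("c2", "coffee"), ("c2", "bread"), ("c3", "coffee"),
]

def top_products(transactions, n=2):
    """Rank products by purchase count -- the raw trend signal retailers mine."""
    counts = Counter(product for _, product in transactions)
    return [product for product, _ in counts.most_common(n)]
```

Real retail analytics layers time windows, store locations, and customer segments on top of this, but the core operation, counting and ranking events, is the same.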
Information Technology in Academics
Information technology in academics makes it possible to link faculty with students around the globe. There is, and perhaps always will be, debate as to the merits of online education over traditional classroom settings. Information technology supports interaction, feedback, and experience: teleconferences, online chats, and applications such as Skype continue to enhance one-on-one interaction among professors and fellow classmates on or off campus. Traditional education institutions cannot stop information technology from offering educational opportunities. Those institutions that adapt to the changes technology brings and use information technology to enhance the student experience will be those that continue to thrive. For consumers, the cost of an online education is usually much lower than a traditional campus-based program. Online education offers the convenience and adaptability to customize the education process to individual student goals, learning styles, and schedules. Some experts have referred to the advances that information technology is bringing to academics as a “modern industrial revolution.” Information technology careers in the field of academics will continue to expand as educational programs across the country adapt to these rapid changes.
Information Technology in Health Care
In the past, the health care system has made less use of information technology than other industries. Possible organizational challenges to the adoption of health information technology within the health care system include the fragmentation of services, redundant services by multiple providers, and payment system barriers. These are a few obstacles to implementing the beneficial changes that health information technology has to offer a system on the verge of unprecedented change. Some health care systems that coordinate care between services and team members are using new health information technology to deliver quality care by integrating health history, screening results, treatment plans, and other complex information from multiple providers. The health care reform currently under debate will bring the need for widespread adoption of information technology.
Health information technology can streamline the collection, storage, and retrieval of patient information. Computerized orders from physicians reduce prescription errors, miscommunication, and the time it takes to carry out changes in medications or other orders. Electronic health records now used in house may soon be available to multiple health care providers at the click of a mouse. This raises concerns about security, privacy rights, and access options. As insurance options enter the health care databases, and as parameters imposed by the emerging health care reform bring even more change to the health care system, information technology in health care will continue to evolve to meet new demands. This will create new information technology jobs at multiple levels. Those pursuing information technology careers in the health care field will need to pursue professional certification and specialization. Combining degrees in health-related fields with those in information technology can provide a distinct advantage over others with only an IT background.