20 percent of business content will be authored by machines.
Six billion connected things will be requesting support.
45 percent of the fastest-growing companies will have fewer employees than instances of smart machines.
Three million workers globally will be supervised by a “robo-boss.”
Customer digital assistants (CDAs) will recognize individuals by face and voice across channels and partners.
Two million employees will be required to wear health and fitness tracking devices as a condition of employment.
By 2020, autonomous software agents outside of human control will participate in 5 percent of all economic transactions.
By 2020, smart agents will facilitate 40 percent of mobile interactions.
In the next five years, innovations in storage, devices, chips, and other hardware will revolutionize IT. Here are 10 emerging hardware technologies CIOs should begin to consider in their strategic roadmaps.
1: Mobile devices with hardened security
Security continues to be a major challenge with mobile devices. One option is Intel’s Software Guard Extensions (SGX) technology, which will support the use of secure encrypted memory spaces on commodity CPUs. The goal is to provide applications with an area of secure and protected memory and execution. This could be a boon for mobile devices, a leading source of security breaches that corporate IT must contend with. “We will see the start of a new generation of systems solutions that guarantee security even if the operating system or other infrastructure gets compromised by hackers,” said Sriram Rajamani, Microsoft Research India’s assistant managing director, in an eWeek piece on tech predictions.
2: New chip architectures that improve machine learning performance
As more IoT and machine-based applications enter the IT mainstream, new chip architectures will improve performance over what is presently available with graphics processing units (GPUs). These improvements will dramatically speed data transfers and the execution of machine learning and analytics workloads.
3: Expanded commercial use of drones
Unmanned aerial vehicles (UAVs) will continue to push into commercial applications, whether delivering packages, taking photographic images, or surveying physical terrain that is difficult to access. They will collect IoT data through sensors and channel it into central communications systems.
4: Unmanned robots
Robots can carry out simple medical procedures, clean facilities, and pick and pack items in warehouses. The intelligence in these automated machines will be further increased as new technologies come onboard to collect everything that has been learned by all machines into a central data bank that any machine can access.
5: More user-friendly virtual reality gear
Bulky headsets have encumbered VR users and made them dizzy or seasick, prompting companies to avoid VR applications. That’s about to change. For example, Google Cardboard provides a small holder for your smartphone and delivers a full-bodied video experience that rivals those produced by older headgear. More comfortable VR headgear will pave the way for greater corporate adoption of VR.
6: New storage technology for greener power grids
Data centers will continue to go green as power companies find better ways to blend diverse energy sources, such as solar, wind, and traditional fossil fuels, into a seamless and uninterrupted supply of energy. Today, the use of hybrid energy is difficult because sources like wind and solar are variable. Better storage can solve this and usher in a new green era that could save data centers and other energy users an estimated $3.4 billion per year.
7: More on-the-job wearables
Juniper Research predicts that smart glasses, smart watches, and a range of motion-sensing devices could improve productivity by 30%. Gartner predicts that by 2018, two million employees, such as law enforcement officers and paramedics, will be required to wear health and fitness tracking devices.
8: Local energy harvesting for Internet access
With automation and a plethora of IoT devices adding to the internet’s workload, new technology is needed to expand bandwidth and ensure ready access. University of Washington researchers have developed technology that enables internet-connected temperature and motion sensors, cameras, and similar devices to communicate using only energy harvested from nearby TV, radio, cellphone, and Wi-Fi signals. A principle known as backscattering allows IoT devices to absorb energy emitted by other electronics, enabling them to reduce their internet bandwidth demands. This localized Wi-Fi consumes just 1/10,000th as much power as existing Wi-Fi chipsets.
9: More compact flash memory
3D NAND technology continues to move forward, delivering smaller and more lightweight laptops, tablets, and other devices. Much of this progress is due to the ability of companies like Intel and Micron to stack flash memory cells vertically, which conserves space and enables devices to be smaller and thinner.
10: Nonvolatile memory
With nonvolatile memory, computers can retrieve information even after being turned off and back on. Going forward, we will see new forms of nonvolatile memory that store data at lower cost and with less power, enabling smaller devices to hold even more data.
Welcome on board
Get the full picture
Understanding the work space (people, equipment, software)
Explore the production workflow processes, business model
Familiarize yourself with organization policies and procedures
Determine performance influencers and key indicators
Socialize with board members
Seek out undervalued, overlooked mentors
Avoid undercover operators, insiders, and gossips
Learn the communication deficits and blockers
Watch out for bullies, drudges, and progress suppressors
Define a plan from where to start and whom to start with
Be effective and efficient
Show self-esteem
Find shortcuts and zigzaggers
Capacity sizing and planning
Define communication channels
Discover failure stories
Keep listening and learning
Evaluate underpinning contracts
Establish direct vendor relationships and SLAs
The Internet of Things (IoT) is poised to bring millions of devices online, and as many as a quarter million unique IoT applications will be developed by the year 2020. That means opportunities for skilled developers and technologists will abound. However, there are other, subtler ways the IoT will affect the job market.
“We’re seeing tech companies around the globe getting organized and creating IoT strategies, but where they’re struggling is they don’t have the processes and talent in-house to make these things happen,” says Ryan Johnson, categories director for global freelance marketplace Upwork. By tracking data from Upwork’s database, Johnson and his team have identified major technology skills companies need to drive a successful IoT strategy.
Skills like circuit design, AutoCAD, and microcontroller programming will address businesses’ need to adapt circuit designs to new form factors and system requirements, design new hardware, and add programming and data memory to microcontrollers, Johnson says.
There’s also great demand for talent skilled in machine learning, algorithm development and data analytics so that companies can develop new ways to gather, analyze and take action based on the data points connected devices create, says Scott Noteboom, CEO of machine learning company LitBit.
There will be a major focus on how interconnected and Internet-enabled devices can communicate with the people who design, develop and build them, Noteboom says, but there will also be an emphasis on how these devices can communicate with each other.
“One of the things we’re looking at is the idea of crowdsourcing, but in this case the ‘crowd’ is made up of all these connected machines,” Noteboom says. How can machines share insights on failure data? Imagine a sensor in a household air conditioning unit that measures refrigerant level, energy usage, heat, and other necessary metrics. Not only can it learn that when the refrigerant level gets too low, the unit vibrates differently and produces a higher level of heat; now it can go on the internet and share that data with all the other relevant AC units: “This is what it looks, feels and sounds like when you’re about to fail!”
“Then they all set up new alerts so their owners know when to have the refrigerant levels checked. That’s the ‘perfect storm’ before the compressor is going to burn out; the unit’s failing. That’s pretty awesome,” Noteboom says.
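Noteboom’s crowdsourced-failure idea can be sketched in a few lines. Everything below is hypothetical illustration, not any vendor’s API: one machine publishes a learned failure pattern to a shared data bank, and a different machine with a similar reading gets the warning.

```python
from dataclasses import dataclass, field

@dataclass
class FailureSignature:
    """A pattern one device learned shortly before a failure."""
    metric: str
    threshold: float
    outcome: str

@dataclass
class SharedKnowledgeBase:
    """Hypothetical central data bank that every connected unit can query."""
    signatures: list = field(default_factory=list)

    def publish(self, sig: FailureSignature) -> None:
        self.signatures.append(sig)

    def check(self, metric: str, value: float) -> list:
        """Return warnings learned by *other* machines that match this reading."""
        return [s.outcome for s in self.signatures
                if s.metric == metric and value >= s.threshold]

# One AC unit learns that excess compressor heat precedes burnout...
kb = SharedKnowledgeBase()
kb.publish(FailureSignature(metric="compressor_temp_c", threshold=95.0,
                            outcome="compressor burnout imminent"))

# ...and a different unit, seeing a similar reading, gets the warning.
print(kb.check("compressor_temp_c", 97.2))  # ['compressor burnout imminent']
```

The point of the sketch is the inversion: the knowledge lives in the shared registry, not in any single device.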
The IoT has already demonstrated an increased need for security, both because of the potential for increased data exposure and because of the device-level and physical security of connected “things,” says Johnson.
“The added scale and complexity of IoT connectivity, communications and the endpoints themselves complicates things. Within security infrastructure, we’re seeing strong demand on our platform for network security developers and programmers, and people with vulnerability analysis experience to conduct in-depth assessments to identify threats to embedded systems such as local controllers/gateways and determine the risk at the device level,” Johnson says.
The human factor
There’s another, more human way the IoT will affect the job market that’s not often addressed, says Noteboom. With so many connected, communicative devices that could potentially have the ability to take on arduous, repetitive tasks and “drudge work,” the IoT could open up new avenues of creativity and collaboration, he says.
“The IoT has the potential to change the human experience the same way the assembly line and the Industrial Revolution did. It changes the human-machine relationship in similar ways; machines will soon be able to do repetitive tasks driven by their past experiences,” he says. That means more time and energy for solving problems: creating technology that addresses pollution, saves energy, uses biotechnology to find new ways to grow crops, or generates electrical power, he says.
“If you can use IoT in a data center, for instance, to figure out optimal cooling levels and regulate power consumption, you can help companies save energy without having as many personnel involved. IoT can help reduce the amount of repetitive work, and that will free up people to do more learning, exploring and creating new ideas, new knowledge. Instead of focusing on the accumulation of learning things, we can focus on creating new things that will help our fellow humans,” Noteboom says.
Anyone in charge of the operations of a medical practice understands that there are many tasks that need to be completed on a daily basis. With a healthcare practice, customer interaction is vital. Your patients want to speak with a representative even outside of normal office hours. Because of this, many offices consider utilizing an answering service to provide patients the level of personal contact they require. However, before you trust your important patient information to a medical call center, you want to make certain that the information will be protected. Not only do you want to do this as a service to your patients, but you are responsible for making certain that HIPAA privacy standards are upheld by the call center from which you contract services. Learn more about the different ways a call center can follow these HIPAA guidelines so that you know what to expect when you contract with one of these organizations.
What are the Basic HIPAA Requirements?
The entire set of HIPAA privacy regulations is quite complex and as such is outside the scope of a short blog post. However, the basic layout of the requirements is as follows:
The goal of HIPAA is to protect the health information of patients. Protected health information relates directly to an individual’s past, current, or future medical care. This can include health care billing and payment information, as well as demographic information.
Patients have the right to decide how their health care information is used. Therefore a patient must sign a release of information before it can be shared outside of the doctor-patient setting. The reasoning behind this is that it will better control how patient medical records are managed.
Once the health care professional has information from a patient, they are required to follow certain guidelines to protect it. Wrongful disclosure or misuse of medical information is prohibited and could subject a medical professional to fines and/or imprisonment.
Because of the stringency of these guidelines, it is vital that the doctor’s office only works with a call center that will keep the information just as secure as the practice itself.
Assessing a Potential Call Center for HIPAA Compliance
There are a number of ways that the medical call center can keep this patient information confidential. However, look for a call center that – at the very least – follows these simple guidelines:
Hires only screened professionals to work with sensitive data. This can prevent an inside information leak. Ask any prospective answering service provider what sort of screening process they have in place for agents who answer calls on behalf of healthcare providers.
Has the flexibility to work with the medical practice to develop customized procedures and policies that ensure your specific needs are met.
Utilizes encryption on computers, smartphones, and any other devices that house patient information. This can prevent information from leaking to a hacker or in an accidental breach of the computer’s basic security system.
Regularly conducts security assessments. This ensures that the facility does not have any gaps in privacy services.
Has a disaster recovery plan. Should a catastrophic event befall the call center, a properly conceived disaster recovery (DR) plan will ensure that all data pertaining to your business and your patients remains secure and can be restored and retrieved.
Provides ongoing HIPAA training to call center management and staff. Staying up to date on current practices and regulations requires an ongoing dedication to training.
Your medical call center provider is an important business partner that provides a critical service to your business – they handle patient communications so that your staff can focus on patient treatment. Since so much patient information flows through the call center or answering service, it is every bit as important for the call center to be HIPAA-compliant as it is for your internal team to be compliant. You should undertake a thorough review of a potential call center’s practices in order to be confident that your patients’ information will be properly handled.
The hacker who stole millions of email addresses and passwords belonging to LinkedIn users back in 2012 now intends to sell them – a trove that has grown to 117 million email addresses and passwords.
LinkedIn users well remember the 2012 breach, in which some 6.5 million passwords were leaked online and the data of millions of the site’s customers was stolen.
Four years after the incident, a hacker going by the name Peace is offering for sale a database covering no fewer than 167 million LinkedIn accounts, of which 117 million passwords have been cracked.
The only remedy: if you haven’t changed your password since 2012, change it now, and don’t pick anything easy. A varied mix of words that mean something to you, or even the lyrics of a favorite song, combined with some symbols, is far better.
Here are several promising security proposals that could make a difference in Internet security. None are holistic solutions, but each could make the Internet a safer place, if they could garner enough support.
1. Get real about traffic routing
The Internet Society, an international nonprofit organization focusing on Internet standards, education, and policy, launched an initiative called MANRS, or Mutually Agreed Norms for Routing Security.
Under MANRS, member network operators — primarily Internet service providers — commit to implementing security controls to ensure incorrect router information doesn’t propagate through their networks. The recommendations, based on existing industry best practices, include defining a clear routing policy, enabling source address validation, and deploying antispoofing filters. A “Best Current Operational Practices” document is in the works.
It’s Networking 101: The data packets have to reach their intended destination, but it also matters what path the packets take. If someone in Canada is trying to access Facebook, his or her traffic shouldn’t have to pass through China before reaching Facebook’s servers. Recently, traffic to IP addresses belonging to the U.S. Marine Corps was temporarily diverted through an ISP in Venezuela. If website traffic isn’t secured with HTTPS, these detours wind up exposing details of user activity to anyone along the unexpected path.
Attackers also hide their originating IP addresses with simple routing tricks. The widely implemented User Datagram Protocol (UDP) is particularly vulnerable to source address spoofing, letting attackers send data packets that appear to originate from another IP address. Distributed denial-of-service attacks and other malicious attacks are hard to trace because attackers send requests with spoofed addresses, and the responses go to the spoofed address, not the actual originating address.
When the attacks are against UDP-based servers such as DNS, multicast DNS, the Network Time Protocol, the Simple Server Discovery Protocol, or the Simple Network Management Protocol, the effects are amplified.
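Source address validation, one of the MANRS recommendations above, comes down to a membership test: a packet arriving on a customer-facing port is dropped unless its claimed source address falls inside a prefix assigned to that port. A minimal sketch of that ingress filter using Python’s ipaddress module; the prefixes are hypothetical documentation ranges, not any real ISP’s allocations:

```python
import ipaddress

# Prefixes legitimately assigned to a customer-facing port (hypothetical).
ALLOWED_PREFIXES = [ipaddress.ip_network("203.0.113.0/24"),
                    ipaddress.ip_network("198.51.100.0/25")]

def source_is_valid(src_ip: str) -> bool:
    """BCP 38-style ingress filter: accept a packet only if its claimed
    source address falls inside a prefix assigned to this port."""
    addr = ipaddress.ip_address(src_ip)
    return any(addr in net for net in ALLOWED_PREFIXES)

print(source_is_valid("203.0.113.77"))  # True  - legitimate customer source
print(source_is_valid("192.0.2.10"))    # False - spoofed source, drop it
```

A spoofed UDP request claiming to come from a victim’s address fails this check at the edge, which is exactly why the amplification attacks described above depend on networks that skip it.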
Many ISPs are not aware of different attacks that take advantage of common routing problems. While some routing issues can be chalked up to human error, others are direct attacks, and ISPs need to learn how to recognize potential issues and take steps to fix them. “ISPs have to be more responsible about how they are routing traffic,” Webb says. “A lot of them are susceptible to attack.”
ISOC had nine network operators participating in the voluntary program when it launched in 2014; now there are more than 40. For MANRS to make a difference, it needs to expand so that it can influence the market. ISPs that decide not to bother with the security recommendations may find they lose deals because customers will sign with MANRS-compliant providers. Or smaller ISPs may face pressure from larger upstream providers who refuse to carry their traffic unless they can show they’ve implemented appropriate security measures.
It would be great if MANRS became a de facto standard for all ISPs and network providers, but scattered safe neighborhoods are still good enough. “If you require everyone to do it, it is never going to happen,” Webb says.
2. Strengthen digital certificate auditing and monitoring
There have been many attempts to address the issues with SSL, which protects the majority of online communications. SSL helps identify if a website is the site it claims to be, but if someone tricks a certificate authority (CA) into fraudulently issuing digital certificates for a site, then the trust system breaks down.
Back in 2011, an Iranian attacker breached Dutch CA DigiNotar and issued certificates, including ones for Google, Microsoft, and Facebook. The attacker was able to set up man-in-the-middle attacks with those certificates and intercept traffic for the sites. This attack succeeded because the browsers treated the certificate from DigiNotar as valid despite the fact that the sites had certificates signed by a different CA.
Google’s Certificate Transparency project, an open and public framework for monitoring and auditing SSL certificates, is the latest attempt to solve the man-in-the-middle problem.
When a CA issues a certificate, it’s recorded on the public certificate log, and anyone can query for cryptographic proof to verify a particular certificate. Monitors on servers periodically examine the logs for suspicious certificates, including illegitimate certificates issued incorrectly for a domain and those with unusual certificate extensions.
Monitors are similar to credit reporting services, in that they send alerts regarding malicious certificate usage. Auditors make sure the logs are working correctly and verify a particular certificate appears in the log. A certificate not found in the log is a clear signal to browsers that the site is problematic.
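The cryptographic proof behind these log checks is a Merkle audit path: a handful of sibling hashes that let anyone recompute the log’s root hash from a single certificate entry. A toy sketch of the idea using the leaf and interior-node prefixes from RFC 6962; the four-entry log and certificate names are invented for illustration:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def leaf(entry: bytes) -> bytes:
    return h(b"\x00" + entry)   # RFC 6962 leaf prefix

def node(l: bytes, r: bytes) -> bytes:
    return h(b"\x01" + l + r)   # RFC 6962 interior-node prefix

# A toy four-certificate log (hypothetical entries).
certs = [b"cert-A", b"cert-B", b"cert-C", b"cert-D"]
l0, l1, l2, l3 = map(leaf, certs)
root = node(node(l0, l1), node(l2, l3))

def verify(entry: bytes, path, root: bytes) -> bool:
    """Recompute the root from a leaf plus its audit path.

    `path` is a list of (sibling_hash, sibling_is_right) pairs,
    ordered from the leaf level up to the root."""
    cur = leaf(entry)
    for sibling, sibling_is_right in path:
        cur = node(cur, sibling) if sibling_is_right else node(sibling, cur)
    return cur == root

# Audit path for cert-C: sibling l3 on the right, then hash(l0, l1) on the left.
path = [(l3, True), (node(l0, l1), False)]
print(verify(b"cert-C", path, root))  # True
print(verify(b"cert-X", path, root))  # False
```

Because the audit path is logarithmic in the log size, an auditor can confirm a certificate’s presence in a log of millions of entries with only a few dozen hashes.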
With Certificate Transparency, Google hopes to tackle wrongly issued certificates, maliciously acquired certificates, rogue CAs, and other threats. Google certainly has technology on its side, but it has to convince users that this is the right approach.
DNS-based Authentication of Named Entities (DANE) is another attempt to solve the man-in-the-middle problem with SSL. The DANE protocol reinforces the point that a sound technology solution doesn’t automatically win users. DANE pins SSL sessions to the domain name system’s security layer DNSSEC.
While DANE successfully blocks man-in-the-middle attacks against SSL and other protocols, it is haunted by the specter of state surveillance. DANE relies on DNSSEC, and since governments typically own the DNS for top-level domains, there is concern about trusting federal authorities to run the security layer. Adopting DANE means governments would have the kind of access certificate authorities currently wield — and that makes users understandably uneasy.
Despite any misgivings users may have about trusting Google, the company has moved forward with Certificate Transparency. It even recently launched a parallel service, Google Submariner, which lists certificate authorities that are no longer trusted.
3. Tackle the malware problem once and for all
Almost a decade ago Harvard University’s Berkman Center for Internet & Society launched StopBadware, a joint effort with tech companies such as Google, Mozilla, and PayPal to experiment with strategies to combat malicious software.
In 2010 Harvard spun off the project as a stand-alone nonprofit. StopBadware analyzed badware — malware and spyware alike — to provide removal information and to educate users on how to prevent recurring infections. Users and webmasters can look up URLs, IPs, and ASNs, as well as report malicious URLs. Technology companies, independent security researchers, and academic researchers collaborated with StopBadware to share data about different threats.
4. Reinvent the Internet
Then there’s the idea that the Internet should be replaced with a better, more secure alternative.
Crockford also has an answer for SSL’s reliance on certificate authorities: a mutual authentication scheme based on a public key cryptographic scheme. Details are scarce, but the idea depends on searching for and trusting the organization’s public key instead of trusting a specific CA to issue the certificates correctly.
The airline is using a private cloud and open-source software to analyse social media and understand what consumers think about it.
Gulf Air has created a private cloud to support a big data engine that will enable it to monitor consumer sentiment about the airline on social media.
Bahrain’s national carrier is using Red Hat Enterprise Linux, Red Hat JBoss Enterprise Application Platform, and Red Hat Storage as a platform for its Arabic Sentiment Analysis system, which monitors people’s comments through their social media posts.
It processes the posts and provides reports on what customers are saying about Gulf Air.
The open-source software meant no licence fees as the airline was able to run it on its existing infrastructure.
Gulf Air, which has 28 aircraft serving 39 cities in 22 countries, has developed a sentiment analysis engine using big data technologies that can address social media posts in both Arabic and English. It is based on an open-source Hadoop big data framework running across servers in Gulf Air’s private cloud environment.
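Gulf Air has not published the internals of its sentiment engine, but the core idea of lexicon-based scoring across Arabic and English posts can be sketched in a few lines. The word lists here are invented placeholders, not the airline’s actual lexicons, and a production system would run this kind of scoring as a distributed job over the Hadoop cluster:

```python
# Hypothetical bilingual lexicons: +1 for positive tokens, -1 for negative.
POSITIVE = {"great", "excellent", "comfortable", "ممتاز", "رائع"}
NEGATIVE = {"delayed", "lost", "terrible", "سيء", "تأخير"}

def score(post: str) -> int:
    """Crude lexicon score for one social media post: the number of
    positive tokens minus the number of negative tokens."""
    tokens = post.lower().split()
    return sum((t in POSITIVE) - (t in NEGATIVE) for t in tokens)

print(score("excellent crew but baggage delayed"))  # 0
print(score("رائع"))                                 # 1
```

Real Arabic sentiment analysis needs morphological normalization and dialect handling well beyond token matching, which is part of why the airline built a dedicated engine for it.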
The private cloud encompasses 200 servers running more than 100 core applications and holds more than 50 terabytes of data.
Gulf Air’s IT team also uses this as the basis for a wider analysis of the state of the market and actions taken by the carrier’s competitors.
As many of you likely already know, Interop opened this week in Las Vegas. What you might not know is that it hosted a one-day Software-Defined Architecture Summit that focused on everything from what is software-defined technology to how to manage the migration from hardware- to software-based technology, and everything in between. If you missed the event, don’t worry. You can still get access to the Summit presentations here.
I spoke to one of the Summit’s presenters, Jack Poller, a lab analyst with the Enterprise Strategy Group, and he had an interesting take on how software-defined everything—software-defined networking, storage and data center—is revolutionizing the IT industry or more aptly, transforming IT from Communism to Capitalism.
The way Poller explains it, computers have evolved over time from systems defined by chips—hardware computing—to something now defined by software; in other words, software computing, which is really just virtualization. He further adds that, “This transformation from hardware to software has played out over and over again; first with computer systems, then with storage and networking. Each time, we’ve evolved from viewing the world as based on our hardware architecture and using that to solve a specific problem to a more general view of using a software architecture to solve our problems.”
As that technology evolution occurs, it flips power and control from one group to another. Poller points out that this flip is something that’s occurred throughout history, a recent example of which can be seen in the music industry.
Years ago, musicians made their money off albums and performed concerts as a way to generate interest in buying those albums. Today, however, with the modern Internet and free music downloads, recorded music is no longer scarce. Instead, what’s become scarce are live musical performances. So now, musicians use their recorded music as a form of promotion to drive consumers to their concerts, and their concerts are where they make money.
As Poller explains, the technology of the Internet inverted the power structure in the music industry. He sees the same thing happening to the IT industry because of software-defined technology. “Computers used to be these big, very complex, very expensive machines, and because they were such capital- and resource-intensive things you ended up with a centralized resource and centralized control structure, which is essentially what communism is. Now, with software-defined technology, there is still a centralized block of resources, but those resources can be divvied out as necessary.”
In other words, as the evolution to software-defined technology takes hold, IT will transform from the group that controls the resource and defines what it is into the group that pushes authority, responsibility, control, and decision-making down to the people actually using the resource: those closest to the problem they are trying to solve.
Poller suggests thinking of it this way: “Traditionally, IT says the company needs a storage system, we’re going to invest X amount of dollars in the system and it has to meet the needs of most of the company—even though it doesn’t necessarily meet the particular needs of any one group in the company. With software-defined technology, IT can now get a software-defined storage system and build it exactly to the needs of any group in the company.”
This is a critical point. In the past, the only way to tailor a storage solution to the needs of a particular group was to buy specific storage units to meet those requirements. With software-defined technology, a group can say, “I need a chunk of storage with this response time or this capacity,” and IT can deliver that through software. Furthermore, IT can delegate administrative control to the group that needs that property most, so they can use it however they want.
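The provisioning model Poller describes can be sketched as a tiny control-plane API: one shared pool, per-group volumes with their own service properties, and optional delegation of administrative control. All class and parameter names below are hypothetical illustration, not any vendor’s product:

```python
from dataclasses import dataclass

@dataclass
class Volume:
    owner: str
    capacity_gb: int
    max_latency_ms: float
    delegated_admin: bool = False

class SoftwareDefinedStorage:
    """Hypothetical control plane carving one shared pool into
    per-group volumes, each with its own service properties."""

    def __init__(self, pool_gb: int):
        self.free_gb = pool_gb
        self.volumes = []

    def provision(self, owner: str, capacity_gb: int,
                  max_latency_ms: float, delegate: bool = False) -> Volume:
        if capacity_gb > self.free_gb:
            raise ValueError("pool exhausted")
        self.free_gb -= capacity_gb
        vol = Volume(owner, capacity_gb, max_latency_ms, delegate)
        self.volumes.append(vol)
        return vol

sds = SoftwareDefinedStorage(pool_gb=1000)
# Analytics wants fast storage and manages it itself; archive wants bulk.
sds.provision("analytics", 200, max_latency_ms=1.0, delegate=True)
sds.provision("archive", 700, max_latency_ms=50.0)
print(sds.free_gb)  # 100
```

The inversion Poller describes lives in the `delegate` flag: the group, not central IT, administers the volume it asked for.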
By inverting the power and control, software-defined technology is making IT less like Communism and more like Capitalism. And according to Poller, this is a transformation that’s already begun to take place, although as he points out, “It’s just the beginning of the revolution.”
IT workers at newspapers – American ones in particular – are in imminent danger from outsourcing, as these services are now delivered across continents at costs that barely register.
For McClatchy Company IT employees who will lose their jobs once their work is moved to India, there is fury, and there are questions.
As many as 150 IT employees at the chain, which runs some 30 newspapers, will be losing their jobs. (See: “Newspaper chain sending IT jobs overseas.”)
A government form, called the Labor Condition Application (LCA), is being posted on bulletin boards at the offices of various newspapers in the chain. This form alerts workers that at least one H-1B worker is being used.
Photographs of some of these notices, posted at the Miami Herald, one of the newspapers owned by McClatchy, were sent to Computerworld.
Wipro labor condition application
The top part of a Labor Condition Application posted at the office of the Miami Herald.
“They are basically firing me and hiring a foreign worker to do my job at less than half the rate they were paying me,” said one IT employee. “They really couldn’t find American workers to do this job? Seriously? I am angry as hell.”
“I feel the same way the Disney employees must have felt last year when this exact same thing happened to them,” said this IT employee.
On the form an employer must indicate whether they are H-1B dependent. If H-1B workers comprise 15% or more of an employer’s workforce, the employer is classified as “H-1B dependent” by the U.S. government and subject to additional requirements.
H-1B dependent firms are required to take “good-faith steps to recruit U.S. workers” and not displace them. But there’s a loophole: if these employers pay a visa holder more than $60,000, or that person has a master’s degree, the nondisplacement provisions do not apply.
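The two rules can be expressed as a short calculation. This is a simplification: the actual regulation also has separate fixed headcount tiers for small employers, so treat the 15 percent test below as the large-employer case only.

```python
def is_h1b_dependent(total_employees: int, h1b_employees: int) -> bool:
    """Large-employer test: H-1B workers make up 15% or more of the
    workforce (smaller employers use fixed headcount thresholds instead)."""
    return h1b_employees / total_employees >= 0.15

def nondisplacement_applies(salary: float, has_masters: bool) -> bool:
    """The nondisplacement provisions do not apply to 'exempt' H-1B
    workers: those paid over $60,000 or holding a master's degree."""
    return not (salary > 60000 or has_masters)

print(is_h1b_dependent(1000, 200))            # True  (20% of workforce)
print(nondisplacement_applies(65000, False))  # False (exempt: pay > $60k)
```

The second function is the loophole in miniature: a $65,000 salary clears the exemption bar, so a dependent employer can still replace U.S. workers with such hires.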
A second McClatchy IT employee said it’s difficult to understand how an employer can use foreign workers to send their jobs overseas.
“There is something wrong with the system and the laws that allow these kind of things,” said the second IT worker. “I understand that cutting costs is important for a company in deep trouble like McClatchy, but bringing underpaid workers from India to replace American workers is just crossing the line.”
A McClatchy spokeswoman said the firm would not be commenting.
According to IBTimes UK, Sharjah-based InvestBank has been hacked.
A 10GB Zip file has surfaced online that holds sensitive financial data on tens of thousands of InvestBank customers. The information includes folders called ‘Account Master’, ‘Customer Master’ and ‘Branch Master’, which allegedly contain spreadsheets, PDF files and images.
IBTimes UK has reported that one document, titled ‘Cards’ contains almost 20,000 card numbers, and another holds over 3,000 individual bank statements which are watermarked with InvestBank logos. Other files released are ‘Investors’, ‘land documents’ and ‘passports’, with the latter storing scanned ID cards, passports, insurance cards and customer pictures, as well as full passport data of an InvestBank employee.
The news comes after Qatar National Bank confirmed it had been compromised with 1.4GB of sensitive data leaked.
The InvestBank data was uploaded online by a group using the pseudonym ‘Bozkurt Hackers’; many security experts suspect they were also responsible for the QNB breach.
IBTimes UK also said a similar dataset alleged to contain sensitive information from InvestBank surfaced last December, after the bank refused to meet the demands of a hacker dubbed ‘Buba’. The breach may therefore not be new; the same data may simply have been republished by a different hacking group.
26th April 2016
It has been reported that the Qatar National Bank has been breached.
The Register reported that documents claiming to be from Qatar National Bank have surfaced on file-sharing site Global-Files.net, but have since been deleted.
According to whistle-blower firm Cryptome’s Twitter stream, the leak contained more than 15,000 documents detailing over 100,000 accounts with passwords and PINs.
Cryptome has said it will re-host the files, but has not yet done so.
DOHA, May 1 (Reuters) – Qatar National Bank, the largest bank in the Middle East and Africa by assets, said it has taken immediate steps to ensure its customers suffer no financial losses after a security breach last week exposed the personal data of thousands of customers.
In a statement on Sunday, the bank said: “We would like to note that we are keen to take all measures necessary to protect our customers’ data, and we are cooperating with specialized, independent firms with global expertise to examine all our systems and confirm they contain no vulnerabilities.” It added: “We confirm once again that all our customers’ accounts are entirely secure.” It remains unclear, however, how the bank intends to protect the accounts whose data, including customer names and passwords, has already been published.
The 1.5-gigabyte leak included documents containing bank details, telephone numbers, and dates of birth of numerous Al Jazeera journalists, members of the ruling Al Thani family, and military officials.
Some of the files include photos of account holders taken from Facebook and LinkedIn, a potentially sensitive matter in a conservative country that prizes privacy.
The bank said the breach targeted its reputation rather than its customers, and that it affected only a portion of its clients.
The statement did not identify the hackers.
The bank said some of the data may be accurate, but that much of it “was merged with information from other sources unrelated to Qatar National Bank Group, including personal data from social networks.”
A copy of the leaked content reviewed by Reuters contained data on transactions by the bank’s customers, including remittances from abroad, the most recent dating to September 2015.
One file contained information on what appeared to be 465,437 bank accounts, though only a small fraction of these included what looked like complete account details.