Archive for ‘Misc’


QNB hacked: Qatar National Bank customer data and passwords leaked

26th April 2016

It has been reported that the Qatar National Bank has been breached.

The Register reported that documents claiming to be from Qatar National Bank surfaced on a file-sharing site, but have since been deleted.

According to the Twitter stream of whistle-blower site Cryptome, the leaked trove contained more than 15,000 documents detailing over 100,000 accounts, with passwords and PINs.

Cryptome says it will re-host the files, but has not done so yet.

DOHA, May 1 (Reuters) – Qatar National Bank, the largest bank in the Middle East and Africa by assets, said it has taken immediate steps to ensure its customers suffer no financial losses after a security breach last week exposed the personal data of thousands of customers.

In a statement on Sunday, the bank said: “We would like to note that we are committed to taking all necessary measures to protect our customers’ data, and we are cooperating with specialized, independent firms of global expertise to examine all our systems and verify that they contain no vulnerabilities.” It added: “We reaffirm once again that all our customers’ accounts are completely secure.” It remains unclear, however, how the bank intends to protect the accounts whose data, including customer names and passwords, has already been published.

The 1.5-gigabyte leak contained documents with banking details, phone numbers and dates of birth of numerous Al Jazeera journalists, members of the ruling Al Thani family, and military officials.

Some files included photos of account holders taken from Facebook and LinkedIn, a potentially sensitive matter in a conservative country that prizes privacy.

The bank said the breach targeted its reputation rather than its customers, and affected only a portion of its customer base.

The statement did not reveal the identity of the hackers.

The bank said some of the data may be accurate, but that much of it “was mixed with information from other sources unrelated to Qatar National Bank Group, including personal data from social networks.”

A copy of the leaked content reviewed by Reuters contained data on transactions carried out by bank customers, including remittances from abroad, the most recent dating to September 2015.

One file contained information on what appeared to be 465,437 bank accounts, though only a small portion of these included what looked like complete account details.


One million copies of an old movie encoded into DNA in a vial containing a few droplets of water


A Technicolor scientist surrounded by the latest virtual reality technology inspects a vial containing a few droplets of water — and one million copies of an old movie encoded into DNA.

The company has come a long way since the Hollywood golden age, when the world gazed in awe at the lush palette of “The Wizard of Oz” and “Gone with the Wind” provided by its three-strip cameras.


DNA is almost unimaginably small — up to 90,000 molecules can fit into the width of one human hair — so even such a large library is totally invisible to the human eye. All you can see is the water in the tube.

“This, we believe, is what the future of movie archiving will look like,” Bolot said.

Scientists have been experimenting with DNA as a potential storage medium for years but recent advances in modern lab equipment have made projects like Technicolor’s a reality.

The company’s work builds on research by scientists at Harvard University, who in 2012 successfully stored 5.5 petabits of data — around 700 terabytes — in a single gram of DNA, smashing the previous DNA data density record by a factor of one thousand.
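A quick sanity check of the unit conversion behind that figure:

```python
# 5.5 petabits stored in one gram of DNA, expressed in terabytes.
petabits_in_bits = 5.5e15            # 5.5 petabits = 5.5e15 bits
terabytes = petabits_in_bits / 8 / 1e12   # 8 bits per byte, 1e12 bytes per TB
print(terabytes)                     # 687.5, i.e. "around 700 terabytes"
```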

DNA is a long, coiled molecular “ladder”, the famous double helix structure, comprising four chemical rungs: adenine, cytosine, guanine and thymine, which team up in pairs.


Bolot’s team digitized “A Trip to the Moon” into zeros and ones in computing’s binary code, then transcribed it into DNA code, which was turned into molecules using lab-dish chemicals.

The contents are “read” by sequencing the DNA — as is routinely done today in genetic fingerprinting — and turning it back into computer code.
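The binary-to-DNA transcription described above can be sketched with a simple two-bits-per-base mapping. This is a toy model, not the scheme Technicolor or Harvard actually used; real encodings add error correction and avoid long runs of repeated bases, which are hard to sequence.

```python
# Hypothetical 2-bits-per-base mapping: each pair of bits becomes one base.
BIT_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BIT = {base: bits for bits, base in BIT_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Turn binary data into a DNA strand (synthesis direction)."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BIT_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Turn a sequenced strand back into the original bytes."""
    bits = "".join(BASE_TO_BIT[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"Moon")
assert decode(strand) == b"Moon"   # round trip: sequencing recovers the file
```

Reading the archive back is then exactly the sequencing-plus-decoding step the article describes.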

Converting movies into man-made DNA brings huge advantages, said Bolot, who points out that the archives of every Hollywood studio, currently taking up square kilometers of floor space, could fit into a Lego brick.

Another problem overcome by DNA storage is that the format for reading it doesn’t become obsolete every decade or so, unlike celluloid, VHS, DVD and every other medium in the history of filmmaking.


Schneider Electric highlights benefits of converged OT and IT


Schneider Electric is hosting its fourth ‘Power to the Cloud’ event in Dubai, to showcase the potential of converged operational technology (OT) and information technology (IT) for smart cities.

The event, which is taking place at the Dubai Convention and Exhibition Centre, highlights how Internet of Things solutions that combine OT and IT will drive energy optimization and improve citizen services through analytics, real-time data management and intelligence capabilities.

The show is expected to attract over 2,000 industry visitors and around 200 VIPs from across the Middle East, Europe and Africa region. Industry speakers include experts from Schneider and customers including Etisalat, DEWA, Al Futtaim, Movenpick Group, Starwood Hotels and Resorts, and Abu Dhabi Airports.

Saeed Al Tayer, managing director and CEO of Dubai Electricity and Water Authority (DEWA), said in a keynote speech: “This year’s Power to the Cloud comes at a time of rapid infrastructure transformation and economic change. As we strive to become smarter and more connected than ever before, we need to learn to leverage technologies that are eco-friendly.

Today, the convergence of Information Technology (IT) and Operational Technology (OT) makes it possible to increase process efficiency and optimise scarce resources.

This ties in with the UAE Government’s Vision 2021 launched by His Highness Sheikh Mohammed bin Rashid Al Maktoum, Vice President and Prime Minister of the UAE and Ruler of Dubai, which emphasises the importance of sustainable development and the preservation of the environment.” 

The event includes a dedicated 5,000 square meter experiential zone which showcases various elements of a smart city for homes, hotels, hospitals, educational institutions and utilities. 

Frédéric Abbal, executive vice-president of Energy Business, Schneider Electric, said: “It is important to create enriching community spaces for inhabitants through digital connection, technologies to simplify life and automation to streamline the businesses. These communities, which collectively form cities, need to be tied together with infrastructure that can accommodate the massively growing populations and their evolving expectations.

Power to the Cloud, now in its fourth year, has greatly developed since its inception in 2012 and illustrates the progress Dubai is making on its evolution into a smart city. Through this educational platform, we hope to bring to light the technologies that will positively impact our living spaces and their environment.


A massive Google cloud outage this week went largely unnoticed compared to the type of outcry that accompanies downtime for its competitors — and that’s not a good thing.

The incident began with dropped connections when inbound Compute Engine traffic was not routed correctly, because a configuration change around an unused IP block did not propagate as it should. Services also dropped for VPNs and L3 network load balancers. Management software then attempted to revert to the previous configuration as a failsafe, but this triggered an unknown bug that removed all IP blocks from the configuration and pushed a new, incomplete configuration.

A second bug prevented a canary step from correcting the push process, so more IP blocks began dropping. Eventually, more than 95% of inbound traffic was lost, resulting in the 18-minute Google cloud outage, which was finally corrected when engineers reverted the most recent configuration change.
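The failure mode described here, a bad configuration spreading despite a canary step, is exactly what canary checks exist to prevent. The sketch below illustrates the general idea only; the class names, check, and rollout logic are invented for illustration and are not Google's actual tooling:

```python
class Site:
    """A deployment target that remembers its previous configuration."""
    def __init__(self, name):
        self.name = name
        self.config = None
        self.previous = None

    def apply(self, config):
        self.previous, self.config = self.config, config

    def rollback(self):
        self.config = self.previous

def push_config(new_config, sites, validate):
    """Apply a change to one canary site first; abort the rollout if the
    health check fails, instead of letting a bad change propagate fleet-wide."""
    canary, rest = sites[0], sites[1:]
    canary.apply(new_config)
    if not validate(canary):
        canary.rollback()
        raise RuntimeError("canary failed; rollout aborted")
    for site in rest:
        site.apply(new_config)

# Demo: an incomplete config (no IP blocks) is caught at the canary stage.
good = {"ip_blocks": ["10.128.0.0/9"]}
bad = {"ip_blocks": []}
fleet = [Site(f"site-{i}") for i in range(3)]
for site in fleet:
    site.apply(good)
try:
    push_config(bad, fleet, lambda s: bool(s.config["ip_blocks"]))
except RuntimeError:
    pass
assert all(site.config == good for site in fleet)  # bad config never spread
```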

The outage didn’t affect Google App Engine, Google Cloud Storage, internal connections between Compute Engine services and VMs, outbound Internet traffic, or HTTP and HTTPS load balancers.

SearchCloudComputing reached out to a dozen Google cloud customers to see how the outage may have affected them. Several high-profile users who rely heavily on its resources declined to comment or did not respond, while some smaller users said the outage had minimal impact because of how they use Google’s cloud.

Vendasta Technologies, which builds sales and marketing software for media companies, didn’t even notice the Google cloud outage. Vendasta has built-in retry mechanisms and most system usage for the company based in Saskatoon, Sask., happens during normal business hours, said Dale Hopkins, chief architect. In addition, most of Vendasta’s front-end traffic is served through App Engine.
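Built-in retry mechanisms like Vendasta's are a standard way to ride out short outages. The article doesn't describe Vendasta's implementation, so the following is a generic sketch of retry with exponential backoff and jitter, with illustrative defaults:

```python
import random
import time

def call_with_retry(fn, attempts=5, base_delay=0.5):
    """Retry a call that may fail transiently, backing off exponentially
    (with jitter) between attempts so a brief outage costs retries, not errors."""
    for attempt in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the failure
            # Sleep base * 2^attempt, randomized to avoid thundering herds.
            time.sleep(base_delay * (2 ** attempt) * random.uniform(0.5, 1.5))
```

For an 18-minute outage this alone would not be enough at default settings, but combined with queuing and off-peak usage it explains why some customers never noticed.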

In the five years Vendasta has been using Google’s cloud products, on only one occasion did an outage reach the point where the company had to call customers about it. That high uptime means the company doesn’t spend a lot of time worrying about outages and isn’t too concerned about this latest incident.

“If it’s down, it sucks and it’s a hard thing to explain to customers, but it happens so infrequently that we don’t consider it to be one of our top priorities,” Hopkins said.

For less risk-tolerant enterprises, reticence in trusting the cloud would be more understandable, but most operations teams aren’t able to achieve the level of uptime Google promises inside their own data center, Hopkins said.

Vendasta uses multiple clouds for specific services because they’re cheaper or better, but it hasn’t considered using another cloud platform for redundancy because of the cost and skill sets required to do so, as well as the limitations that come with not being able to take advantage of some of the specific platform optimizations.

All public cloud platforms fail, and it appears Google has learned a lesson on network configuration change testing, said Dave Bartoletti, principal analyst at Forrester Research, in Cambridge, Mass. But this was particularly unfortunate timing, on the heels of last month’s coming-out party for the new enterprise-focused management team at Google Cloud.

“GCP is just now beginning to win over enterprise customers, and while these big firms will certainly love the low-cost approach at the heart of GCP, reliability will matter more in the long run,” Bartoletti said.


Oracle ZFS Storage Appliance system overview

The Oracle ZFS Storage Appliance is designed for mid-tier NAS environments. The line has two products, the ZS3-2 and ZS4-4, that have variable setup options and a wide range of configurations. Oracle ZFS products support mechanical hard disk drives (HDDs) for data, and flash-based solid-state drives (SSDs) for metadata and write acceleration.

The Oracle ZFS Storage ZS3-2 fits up to 184 serial-attached SCSI (SAS) HDDs in capacities of 300 GB, 900 GB and 4 TB, for a maximum 736 TB of storage per single node. Drives are arranged in 24-slot disk shelves. Unlike many NAS arrays, the Oracle ZFS Storage Appliance does not support data SSDs. Instead, Oracle implemented a memory capacity of 1 TB per node, a read flash cache capacity of 12.8 TB and 28 TB of write flash using write accelerators. Write accelerators are drives that store the contents of the ZFS Intent Log (ZIL). Products support 1.6 TB SSDs for the read cache and 300 GB SSDs for the write accelerators.
The ZS3-2 supports up to four write flash accelerators per disk shelf. It can have four or eight 10 Gigabit Ethernet (GbE) Base-T ports depending on configuration. It runs on up to four eight-core Intel Xeon processors, and nodes can be clustered as high as 3.1 PB.

The Oracle ZFS Storage ZS4-4 fits up to 544 SAS HDDs in capacities of 900 GB and 4 TB, for a maximum 2.1 PB of storage per single node. Drives are arranged in 24-slot disk enclosures, with support for up to four write accelerators per enclosure. The array can have up to eight 10 GbE Base-T ports, and runs on eight 15-core Intel Xeon processors and up to 3 TB of memory. ZS4-4 nodes can be clustered as high as 6.9 PB.
ZIL is an intent logging feature designed to increase data availability on ZFS platforms. Write operations to ZFS Storage drives are atomic, meaning they are either performed completely or not at all. A record of each operation — known as the “intent to perform” — is logged to the ZIL before it occurs. In the event of a power failure, the system will read the intent log to detect which operations were in process when the failure occurred and either revert or redo them. The ZIL is stored on flash-based SSDs, providing faster write performance than if it was written to mechanical HDDs.
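The recovery behaviour described above can be modelled as a simple write-ahead intent log. This is a toy sketch of the concept, not ZFS's on-disk format or replay logic:

```python
class IntentLog:
    """Toy write-ahead intent log: record each operation before applying it,
    so a crash mid-write can be detected and the operation redone on recovery."""
    def __init__(self):
        self.entries = []   # stands in for the SSD-backed intent log
        self.store = {}     # stands in for the main data pool on HDDs

    def write(self, key, value, crash_before_apply=False):
        self.entries.append((key, value))   # log the intent first
        if crash_before_apply:
            return                          # simulate a power failure here
        self.store[key] = value             # apply the write atomically
        self.entries.pop()                  # committed: retire the log entry

    def recover(self):
        """On restart, redo any operations that were logged but not applied."""
        for key, value in self.entries:
            self.store[key] = value
        self.entries.clear()

log = IntentLog()
log.write("a", 1)
log.write("b", 2, crash_before_apply=True)  # power fails mid-operation
log.recover()
assert log.store == {"a": 1, "b": 2}        # the interrupted write was redone
```

Placing the log on flash rather than spinning disk, as the appliance does, makes the "log the intent first" step cheap enough not to throttle write throughput.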

In addition to ZIL, the Oracle ZFS Storage Appliance includes software for storage management, monitoring and encryption. The ZFS Storage Software has features such as thin provisioning, monitoring and analytics, support for iSCSI and Fibre Channel interconnects, and replication within local ZFS Storage clusters. Additional software can be licensed separately for remote replication, AES 256-/192-/128-bit encryption and database backup. The ZFS Storage Appliance is tuned to work with Oracle databases, and its software includes the Snap Management Utility for Oracle Database and the Oracle Enterprise Manager Plug-in for Oracle ZFS Storage Appliance.

Pricing for the Oracle ZFS Storage Appliance depends on configuration. Pricing for the ZS3-2 model ranges from $35,600 to $314,600, while the cost of the ZS4-4 is between $135,600 and $988,900. All ZFS Storage products come with a one-year limited hardware warranty with phone support during local business hours. Response times are tiered by severity: Severity 1 has a four-hour response time, Severity 2 has an eight-hour response time and Severity 3 has a next-business-day response. An optional premier warranty provides 24/7 technical support and two-hour on-site support. Oracle’s advanced support package provides further features such as 24/7 monitoring, system installation and support.


Stratasys re-energizes 3D printing with push-button J750 that prints 360,000 colors

OtterBox has been using 3D printing to help design its tank-like phone cases for over a decade. But, the biggest leap forward in its rapid prototyping process happened in the past six months. A prototype of one of its multi-colored cases used to take 3 days to print, paint, and finish. Now, it takes 30 minutes with the Stratasys J750, which OtterBox has been beta testing since last fall.

As of Monday, any company can now take advantage of this technology to shorten its product development lifecycle. The J750 is available from Stratasys today and can be ordered from its website. Delivery times will vary based on geography.

To get the exact cost of a J750, you’ll have to connect with Stratasys for the specifics for your company and region, but you’re typically looking at a price tag in the hundreds of thousands of dollars for an industrial-strength rapid prototyping machine like this one.

Stratasys is also the company that owns MakerBot—the manufacturer of the world’s most well-known desktop 3D printers—and we can expect that advances in high-end “additive manufacturing” will also trickle down to consumer 3D printers eventually. My ZDNet colleague Larry Dignan analyzes what the J750 means for the 3D printing market.

The reason that the J750 represents such a breakthrough in 3D printing is that it can print 360,000 colors and a combination of 6 different materials. While there are 3D printers that can now print metal, wood, and even human cells, the J750 remains focused on combining a variety of different plastics to help manufacturers produce prototypes and parts.

By combining multiple materials into its prints, the J750 can achieve a lot of different strengths, textures, and opacities. And, the ability to print so many color combinations without having to change the printer’s configuration has surpassed anything else that has hit the market so far.

It’s a game changer. And, it has industrial designers drooling.

On the first day that the beta version of the J750 arrived at OtterBox last year, the team quickly printed one of their in-progress iPhone cases—just to see how it would look. The reaction was, “Whoa, this looks just like our final part,” said Brycen Smith, engineering technician supervisor. 


For fun, they sent it to their testers to see how well the color matched to the company’s standards for the final product. “It was within our manufacturing tolerances,” said Smith.

“The day we got it in, one of our product development directors said, ‘Can we get two?’” he added.


You don’t have to be in management to succeed in business

Many a company founder has gladly relinquished the title of President or CEO to be a Chairman, a CTO or simply a founder. What these individuals had in common was a love of innovation and a desire to keep innovating and building. They didn’t see the daily life of company administration, or interacting with boards, stakeholders and analysts, as particularly satisfying; nor did they want to be managing projects or product launches. Yet because of their ability to innovate and create, they enjoyed rewarding and highly lucrative careers.

The bottom line? You don’t have to be in management to succeed in business.

These words continue to hold true even if you are not a skilled innovator, but are instead highly skilled in a discipline that your company regards as critical to its success. In the IT world, for example, there are data architects and scientists who start as new hires with six-figure salaries. There is similar recognition for the finely honed skills of engineers, application developers, and security analysts.

That’s important to know because business schools and companies continue to instill the idea that the path to success and monetary gain is through management. This is the ideology that compels those who are technically gifted to try to remake themselves so they can fit into positions that do not naturally line up with their talents. On the flip side, it is also the impetus behind the conversations that go on in technical expert cubicles about management spending time away at seminars so it can practice buzzwords.

The reality is that management and technical expert positions can be mutually exclusive because they require different skillsets. Managers, if they concentrate on technical problem solving, will neglect the most important parts of their jobs, like keeping their departments running, delivering on key business strategies, and ensuring funding so those strategies can be carried out. Technical experts and innovators deliver the value of what a department or a company offers through their genius and technology skills.

There are still companies and individuals who do not understand the importance of this dual-pronged approach to work. The refreshing news is that more organizations are starting to understand this idea. The way that they are showing it is by creating dual promotion ladders—one for management and one for technical contributors. Salaries between the two are commensurate.
This gives innovators and technical geniuses a career path, and it enables them to follow their natural bent—knowing that they can also obtain stock options, bonuses, high salaries, and corporate recognition.
Most importantly, it enables these individuals to be themselves—and to shine at what they do best.
The takeaways if you are a technical innovator or expert with absolutely no desire to manage people, politics, or budgets are to:
Develop your technical skillsets and/or talents in a particular area of need.

Find (or create) an organization that respects these skills and that will reward them.

Be the best at what you do.

Several years ago, I had lunch with an IT acquaintance who had been a database and application innovator for years, but who was routinely passed over whenever a management position came up. In his mid-forties, he realized that he would likely never advance in the company, which he believed required getting into management. One year later, he had gotten together with a few other highly skilled techie friends and founded his own company. Now in his core zone of excellence, he was abundantly happy, except for one thing: company growth demanded that he hire a manager, which he was more than happy to do.


18 ways to hack a human [Infographic]

What will be the cause of your next security breach?
Will it be your firewall? 

Will it be your VPN?

Chances are, your next security breach will be caused by hackers exploiting someone within your organization. 
In just the last two months, a single, simple phishing scam targeted seven organizations, gaining access to W2 information. And business email compromise attacks, in particular, are growing fast and hard to defend against. 
The fact is, it’s “easier to trick someone into opening an email and exploiting a vulnerability that way, or convincing an unsuspecting assistant to provide a few useful bits of information, than it is to directly attack a web application or network connection,” writes George V. Hulme in his Social Engineering Survival Guide. 
In person, by phone, or by email or other digital means, whatever the method of communication, hackers are using highly targeted tactics to take advantage of our feelings, emotions and relationships. 
But there are some simple things you can do to take the target off your back… starting with building your (and your users’) social engineering smarts.  




NTT Data, the IT services subsidiary of Nippon Telegraph & Telephone, has entered into an agreement to acquire Dell’s IT services business, a move that could make the Japanese company an important player in the U.S. market.
NTT is paying over US$3 billion for Dell Services, a profitable operation, a spokesman for Dell, David Frink, said in an email. Dell had earlier declined to comment on the price.
The sale by Dell could help the Round Rock, Texas company raise funds ahead of its proposed acquisition of data storage company EMC.
Dell Services became part of the company as a result of a $3.9 billion acquisition of IT services company Perot Systems in Plano, Texas in 2009.
Clients of Dell Services and NTT Data are expected to benefit as a result of the deal, which is subject to customary closing conditions and regulatory approvals, through an expansion of business process outsourcing capabilities, particularly in the areas of healthcare and insurance.
The customers will also have access to more infrastructure through the additions of Dell Services data centers in the U.S., U.K., and Australia to NTT’s own 230 data centers around the globe.
“There are few acquisition targets in our market that provide this type of unique opportunity to increase our competitiveness and the depth of our market offerings,” NTT Data CEO John McCain said in a statement. He will lead the combined business after the deal is through.
The companies did not disclose when they expected the acquisition to be complete. Dell has been trying to sell its services business for some time, and there have been reports previously that NTT was likely to be the buyer.


The Smart Dubai Platform will be the digital backbone for Dubai’s smart city vision

Dr. Aisha bin Bishr

Dubai announces Smart Dubai Platform.

Smart Dubai Office has officially launched the ‘Smart Dubai Platform’, which is intended to form the central application and control layer for the Dubai smart city.
The platform was announced by Dr Aisha Bin Bishr, director general, Smart Dubai Office, and Osman Sultan, CEO of du, the lead strategic partner in the project.
The platform has been created with a focus on the end-user experience, whether for residents, visitors or businesses, and on six themes: economy, government, people, living, mobility, and environment.
The Smart Dubai Platform has been designed to manage the infrastructure and data generated by smart city systems, and will include features for ingestion, aggregation, storage, advanced analytics and sharing of data.
Dr Bin Bishr commented: “To achieve our ambitious mandate, Smart Dubai has pioneered the most comprehensive blueprint globally, encompassing the whole city, not just one sector or district. We are unifying operations to enable impact across the city, from infrastructure connected to the Internet of Things, to open data and shared data, deriving insights and innovation.
“The Smart Dubai platform will become the digital backbone for our smart city, uniting city infrastructure, open and shared data, enabling services and city-wide smart applications, the platform will become the central operating systems of Dubai. It will be unlike any other smart city platform operating in the world today,” she added.

Dubai’s approach to smart cities is unique, Dr Bin Bishr added, because the city leadership is aiming to develop a single unified smart city, rather than implementing a number of unconnected projects. du was selected as the lead partner in the project because of its understanding of this landmark approach, its ongoing smart initiatives, its investment in smart infrastructure, and its support for entrepreneurship and innovation.
Osman Sultan said: “A city can have smart applications that co-exist, but that does not make it a smart city, and certainly doesn’t make it the smartest city.”
Part of the unified approach will be able to take data from a wide range of sources and make it available to interested parties, Sultan added. The Dubai Open Data Law, announced in October, already outlines the governance of data sharing and data management in this smart city model, he said.
With the data and analytics in place, the platform will give organisations and individuals access to personal dashboards, dashboards for decision makers, and business dashboards providing in-depth analysis of the data. This is intended to significantly enhance real-time, data-driven decision making for the city government, enabling city leaders to engage in community-wide dialogues and analyse rich city data across multiple dimensions. The platform will also enable the continued enhancement of existing smart initiatives and services through data-driven analysis and innovation, and will allow organisations to build smart applications, Sultan added.
“The city will be able to interact with these new ways of doing things; the city will have access to a huge amount of data and will be able to do better planning and better anticipation; the city will be able to see the trends and how people interact with the place,” he said.
du will lead a consortium of technology vendors and consultants which includes HP, Cisco, HortonWorks, EMC, Informatica and MicroStrategy on the project.


How to get back to work after a career break | Carol Cohen


HP IDOL: the OS for human information


HPE Reimagine 2016 – Dubai – Madinat Jumeirah

City Safe Solution’s Reference Architecture – Building Blocks and Use Cases

HPE announced the use case of the city of Auckland, New Zealand, which deployed HP Software to deliver a visionary Big Data project designed to provide a safer community and more efficient roadways for its citizens.
Auckland Transport, Auckland’s government agency responsible for all of its transportation infrastructure and services, will deploy video analytics powered by HP IDOL on servers and storage from HP Enterprise Group, and with support from HP Software Professional Services.
“The safety and well-being of our citizens is always our top priority and the Future Cities initiative is a big step in the right direction,” said Roger Jones, CIO Auckland Transport. “Only HP could comprehensively deliver the custom solution, expertise and ecosystem at this scale to transform our vision into reality.”
Auckland Transport (@AklTransport) will use HP’s integrated big data platform, HAVEn, to analyze, understand and act on vast quantities of data of virtually any type including text, images, audio and real-time video. The system will leverage data from a variety of sources, including thousands of security and traffic management cameras, a vast network of road and environmental sensors as well as real-time social media and news feeds.    
In the first phase of the project, Auckland Transport focused on improving public safety. Law enforcement will use HP Intelligent Scene Analysis System and license plate recognition for accurate identification, and scene analysis to detect dangerous activities and analyze safety threats across the more than 2,000 cameras deployed within Auckland. Going forward, this information will be linked with rich insight from social media and news sources to provide a comprehensive solution that can proactively identify breaking trends and respond to critical safety incidents for cyclists and transport users.
HP Enterprise Group has supplied the hardware infrastructure for Auckland Transport, a combination of powerful servers and storage systems. HP Proliant Gen8 BladeSystem, HP 3PAR StoreServ Storage, HP StoreAll Archive and HP FlexFabric will give Auckland Transport the most advanced hardware, providing superior capabilities for the safe city initiative. 
HP Software Professional Services is instrumental in the process, lending support and expertise to ensure a swift and smooth implementation. 
HP partner VidSys (@VidSys), a global leader in physical security information management systems, has provided a platform that unifies the control and monitoring functions of physical security, building and traffic management, and computer aided dispatch systems. 
Auckland Transport’s investment in big data technologies from HP is part of a larger trend around the emergence of “Smart Cities.” Enlightened city planners are looking at how to leverage big data, sensor data, and data from people and their devices to create improved products and services for citizens.

Advanced Video Analytics  


Intelligent Scene Analysis


HP IDOL – a unique platform for handling the human information continuum, combined with social media analysis



20 features in iOS 9

