Originally Posted On: techrepublic.com

Experts predict companies will continue to hire for cybersecurity, AI, and developer roles throughout the year.

The tech jobs landscape of 2019 will likely look largely the same as it did in 2018, with roles in software development, cybersecurity, and data science dominating across industries.

“Emerging technologies will be key catalysts for the in-demand jobs we expect to see in 2019,” said Sarah Stoddard, community expert at job search site Glassdoor. “From artificial intelligence, automation, virtual reality, cryptocurrency and more, demand for jobs in engineering, product, data science, marketing and sales will continue to rise in order to support the innovation happening across the country.” 

More and more often, traditional companies are beginning to resemble tech companies, and this trend will likely continue throughout the next year, Stoddard said. “As employers across diverse industries, from health care to finance to automotive and more, continue to implement various technologies to streamline workflows and boost business, the demand for top-notch workers who have a balance of technical and soft skills will continue to rise.” 

SEE: The future of IT jobs: A business leader’s guide (Tech Pro Research)

Here are 10 of the most in-demand tech jobs of 2019, according to recruiters and career site experts. 

1. Cybersecurity engineer

Security is a major concern for companies and consumers alike in our connected world, said Marc Cenedella, CEO and founder of executive job search site Ladders.

“Because of this emphasis on organizational safety, we’re seeing a huge upswing in the number of security engineer jobs meant to be the first line of defense to safeguard lucrative products and services,” Cenedella said. 

Internet of Things (IoT) security will become a particular area of focus, as connected devices become staples in daily life and cybercriminals look to exploit them, said Stephen Zafarino, vice president of national recruiting for recruiting agency Mondo. “Companies are definitely looking to figure out how we can protect these new products that we’re putting online and make sure they’re not a vulnerability,” Zafarino said. 

2. AI/machine learning engineer

The explosion in artificial intelligence (AI) and machine learning technologies across the enterprise has led to increased demand for these professionals. “Everyone’s trying to figure out ways to optimize their businesses and their practices, and how to automate and make their day-to-day lives a little bit easier, or a little bit more productive and functional,” Zafarino said. 

3. Full stack developer

Full stack developers are among the most in-demand by employers right now in terms of open job postings, according to data from job search site Indeed.

“Some companies are moving away from siloed back-end and front-end development teams, which requires hiring developers who can work on all levels of the application stack,” said Paul Wallenberg, head of technology recruiting services at staffing and recruiting firm LaSalle Network.

SEE: Job description: Data scientist (Tech Pro Research)

4. Data scientist

Named the No. 1 best job in America by Glassdoor for the past three consecutive years, the data scientist role is expected to remain in high demand in 2019: nearly every company now has the ability to collect data, and all need employees who can effectively organize and analyze this information.

“Companies are continuing to increase their own proprietary data, but are also looking at ways to incorporate third-party data to understand problems impacting their business, and having data science competencies internally enables them to do that,” Wallenberg said. 

5. Python developer

The rise of AI and machine learning technologies has led to increased demand for Python developers in the enterprise, Zafarino said. The fastest-growing programming language, Python is also relatively easy to learn, and has a large developer community. 
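Part of that accessibility is how little ceremony a working Python program requires. As a generic standard-library illustration (not tied to any particular AI workload), counting word frequencies takes only a few lines:

```python
from collections import Counter

# Count word frequencies in a snippet of text -- no classes,
# no type declarations, no build step required.
text = "machine learning and deep learning are subsets of ai"
counts = Counter(text.split())

print(counts.most_common(1))  # the single most frequent word
```

The equivalent program in many enterprise languages would need noticeably more boilerplate, which is part of why Python onboards new developers quickly.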

SEE: Hiring kit: Python developer (Tech Pro Research)

6. Java developer

Java developers will remain in high demand in 2019, according to data from Indeed and Glassdoor. Despite the growth of programming languages like Python and R, Java continues to dominate the enterprise, with the growth of the cloud keeping it on top. 

7. JavaScript developer

JavaScript also remains popular in the enterprise, and will continue to in the new year. “Companies that have development teams structured between front-end and back-end teams are hiring technologists whose strengths lie in using various JavaScript libraries and frameworks to deliver more compelling user interfaces,” Wallenberg said. 

8. Cloud engineer 

Job postings that include the terms cloud computing or cloud engineer have gone up 27% since 2015, according to Indeed. “As companies move away from an on-premise infrastructure model to a cloud-first approach when upgrading or designing new environments, the need to hire technologists with cloud experience has increased dramatically,” Wallenberg said. 

9. Scrum master

Organizations are increasingly turning to Scrum to organize software development, and this method will break out even more in 2019, Cenedella said. “Thousands of companies are hiring so-called scrum masters for the purposes of achieving excellence in self-organization and making changes quickly in their Agile environments,” he added.

10. DevOps engineer

As the DevOps workflow grows increasingly popular, more organizations are seeking DevOps engineers, according to Indeed. The number of job postings mentioning DevOps rose from less than 1% in 2012 to more than 24% in 2017, another Indeed report found. These professionals also ranked no. 2 on Glassdoor’s 2018 Best Jobs in America list.

Originally Posted On: singularityhub.com

We tend to compartmentalize our understanding of the world into “subjects.” From a very young age, we are misled to believe that science is separate from art, which is separate from history, which is separate from economics, and so on.

However, a true understanding of the world and our place in it requires interconnections between many disciplines and ways of thinking. Global challenges, whether they be climate change or wealth inequality, cannot be tackled with a single isolated discipline, but rather require a convergence of subjects and thinking tools.

This is where the convergence of science and the arts is more natural than is often assumed. STEAM is an educational approach to learning that combines science, technology, engineering, the arts, and mathematics. The arts in this context refer not only to the fine arts, but also to the liberal arts and humanities.

STEM education by itself misses the development of critical 21st-century skills that are required as we head towards the Imagination Age and a creative economy. STEAM doesn’t only result in more meaningful learning, but also contributes to more divergent thinking, and consequently, creative innovation. It is also a powerful tool for communicating scientific thought and global issues to the general public.

The below artistic projects celebrate harmony between science and the arts.


Art can be used as a powerful call for action. One strategy is to use art to induce fear in audiences by demonstrating the consequences of inaction. An equally powerful approach is to inspire audiences by providing an exciting vision for the future.

An interactive and immersive exhibition at the World Government Summit in Dubai, Climate Change Reimagined, envisions a desirable future where we have “not only survived the challenges of climate change in the mid-21st century, but have thrived.” The exhibition highlights the global threat of climate change and the urgent need for innovative solutions. It reframes the biggest problems contributing to humanity’s ecological footprint as an opportunity for radical innovation.


It’s rare to meet someone who doesn’t have some appreciation for music. While there are countless genres out there, a love for music appears to be a universal human trait. Symphony of Science is a project that aims to “spread scientific knowledge and philosophy through musical remixes.”

Created by Washington-based electronic musician John D. Boswell, the remixes include audio and video samples from television programs featuring popular scientists, such as Carl Sagan, Richard Feynman, Neil deGrasse Tyson, Bill Nye, Stephen Hawking, and many more.

“Waves of Light” is a remix video that celebrates the beautiful fact that we are able to study the entire history of the universe through the power of light. Its chorus is:

Gaze up into the night sky. Capture the light and read the story of the universe. Isn’t it a wonderful thing? We are part of the universe. Isn’t it a wonderful thing? The story of the universe is our story. Carried on waves of light. Wave after wave after wave of light. All the colors of the rainbow, colors of the rainbow.


The vastness of the universe can often leave us feeling small and insignificant, but then we must remember that we are astronomically massive compared to the microscopic or even quantum world. At such minuscule scales, we are the universe.

From particles popping out of nowhere to light appearing as particles and waves, the quantum world is both wondrous and confusing. It breaks down all of our intuitions about the nature of reality. In the words of legendary physicist Richard Feynman, “If you think you understand quantum mechanics, you don’t understand quantum mechanics.” The quantum world is impossible to observe directly and is studied by indirect methods of observation, such as measuring the remains of proton collisions in the laboratory.

With his short film Quantum Fluctuations: Experiments in Flux, artist Markos Kay has taken on the challenge of the impossible with a daring attempt to animate and visualize the quantum world. With visuals and music that appear to be from a different dimension, Kay visualizes various quantum phenomena such as particle decay and proton showers.


How do the mind and body work to make us feel pain? What role does our psychology play in our physical sensations? What is the evolutionary advantage of pain? Thanks to the hard work of researchers, we have an expansive accumulation of studies and theories that can shed light on these questions—but what better way to communicate the answers than to use art?

Exhibited at the MOD Futures Gallery in Australia, “Feeling Human” is an immersive multi-sensory exhibit that uses different technologies to give visitors experiences that provide a window into findings from centuries of pain research. In the process, attendees of the exhibit learn about the biological, psychological, and social influences of pain. As the gallery says, “Welcome to a dark, sensory world where stories of pain come to life…”

To feel pain is to feel human, and a better understanding of the nature of pain can contribute to a better understanding of ourselves.


An art project from 2017’s Burning Man, Midnight Star is an experiential art piece that aims to give burners a cosmic perspective. All night long, seven illuminated rings pulse rhythmically with ambient outer-space sounds, and at midnight the Big Dipper’s stars glide into perfect alignment with the installation’s seven red rings hovering above the playa.

Embedded into the installation experience is a midnight ritual. The artists and physicists of the team invite participants to take “a guided meditative tour of the night sky.” The experience combines secular spirituality, music (original scores that integrate space recordings and excerpts from talks by notable astrophysicists), and physics. In the process, participants explore the nature of the universe and take a moment to appreciate our place within it all.

Every year, Burning Man’s incredible installations are possible because of the collaboration between artists and scientists. They are symbols not only of radical self-expression, but also of human ingenuity.


We owe much of human advancement to those who have dedicated their lives to the pursuit of knowledge and understanding. The scientific method continues to lead to greater understanding of ourselves and the world, which in turn fuels innovation and global progress.

In the 1997 American film adaptation of Carl Sagan’s Contact, SETI scientist Dr. Ellie Arroway realizes the power of art after she travels through multiple wormholes to see what appear to be signs of an advanced civilization on another planet. The sheer awe of the experience leads Dr. Arroway to say, “Some kind of celestial event. No—no words. No words to describe it. Poetry! They should have sent a poet.” It’s no shock that Elon Musk and his SpaceX team are sending artists to the moon.

Scientific advancement on its own is not enough. We need philosophers to help us explore the implications of groundbreaking scientific findings. We need filmmakers to allow us to visualize and imagine counterintuitive insights. We need poets to highlight the awe and wonder that is associated with it all. The convergence of science and art will help us advance as a species.

Image Credit: NASA images / Shutterstock.com

About the Author: Raya is the founder and CEO of Awecademy, an online platform that gives young minds the opportunity to learn, connect, and contribute to human progress. She is a writer and regular speaker on the topics of innovative education, the future of work, and the effects of exponential technologies on society.

Originally Posted On: threatpost.com

Called BleedingBit, this vulnerability impacts wireless networks used in a large percentage of enterprise companies.


Two zero-day vulnerabilities in Bluetooth Low-Energy chips made by Texas Instruments (and used in millions of wireless access points) open corporate networks to crippling stealth attacks.

Adversaries can exploit the bugs simply by being within approximately 100 to 300 feet of the vulnerable devices. Once an access point is compromised, an attacker can take control of it, capture all of its traffic, and use the device as a springboard for further internal attacks.

The issue impacts Wi-Fi access points made by Cisco, Cisco Meraki and Hewlett-Packard Enterprise’s Aruba, accounting for a large percentage of hardware used in corporations, according to researchers at Israeli security firm Armis. The firm discovered the two bugs earlier this year and publicly disclosed them on Thursday.

“Attacks can be devastating and carried out by unauthenticated users who can exploit these bugs and break into enterprise networks undetected while sitting in the company’s lobby,” said Ben Seri, head of research at Armis.

Texas Instruments released patches (BLE-STACK SDK version 2.2.2) for affected hardware on Thursday that will be available via OEMs. Cisco is expected to release patches for three Aironet Series wireless access points (1542 AP, 1815 AP, 4800 AP), along with patches for its Cisco Meraki series access points (MR33, MR30H, MR74, MR53E), on Thursday. And Aruba has released a patch for its Aruba 3xx and IAP-3xx series access points.

According to Aruba, “the vulnerability is applicable only if the BLE radio has been enabled in affected access points. The BLE radio is disabled by default.”

Cisco representatives told Threatpost that the BLE feature is disabled by default on its Aironet devices.

Aruba is advising its affected customers to disable the BLE radio to mitigate the vulnerability.

“Fixed software was published for all of Cisco’s affected products prior to Nov. 1. A PSIRT advisory was published at the time of the researcher’s disclosure today via our established disclosure page. Meraki also published an advisory in the customer dashboard, and documentation is available to disable the involved settings,” Cisco said in an email to Threatpost.

“The vulnerability can be exploited by an attacker in the vicinity of the affected device, provided its BLE is turned on, without any other prerequisites or knowledge about the device,” according to researchers. The attacker does not need to be on the network; he or she just needs to be within range of the access point and its BLE broadcasts/beacons.

The first vulnerability (CVE-2018-16986) is tied to the Texas Instruments CC2640/50 chips used in Cisco and Cisco Meraki access points. This vulnerability is a remote code-execution flaw in the BLE chip and can be exploited by a nearby unauthenticated hacker.

“First, the attacker sends multiple benign BLE broadcast messages, called ‘advertising packets,’ which will be stored on the memory of the vulnerable BLE chip in targeted device,” researchers said. “Next, the attacker sends the overflow packet, which is a standard advertising packet with a subtle alteration – a specific bit in its header turned on instead of off. This bit causes the chip to allocate the information from the packet to a much larger space than it really needs, triggering an overflow of critical memory in the process.”
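The pattern the researchers describe can be modeled in a simplified, purely hypothetical sketch. The field names, buffer size, and "extended length" bit below are illustrative stand-ins, not TI's actual firmware layout, but they capture the core flaw: a parser that trusts a header flag when sizing a memory write.

```python
ADV_BUFFER_SIZE = 31  # legacy BLE advertising payloads are capped at 31 bytes

def parse_advertising_packet(header: int, payload: bytes) -> bytearray:
    """Toy model of the flaw: a single header bit selects the copy length.

    A correct parser derives the length from the payload itself and rejects
    anything over ADV_BUFFER_SIZE; this one trusts a header bit, mirroring
    the overflow pattern Armis described.
    """
    buffer = bytearray(ADV_BUFFER_SIZE)
    # Hypothetical "extended length" bit: when set, the chip believes the
    # packet carries far more data than the advertising buffer can hold.
    claimed_len = 255 if header & 0x80 else len(payload)
    if claimed_len > len(buffer):
        raise OverflowError(
            f"packet claims {claimed_len} bytes, buffer holds {len(buffer)}"
        )
    buffer[: len(payload)] = payload
    return buffer

# A benign packet parses fine; the identical payload with one header bit
# flipped claims an oversized length. On the real hardware that oversized
# write corrupted adjacent memory rather than raising a clean error.
parse_advertising_packet(0x00, b"\x02\x01\x06")       # benign packet: ok
try:
    parse_advertising_packet(0x80, b"\x02\x01\x06")   # overflow bit set
except OverflowError as exc:
    print("rejected:", exc)
```

The vulnerable chip effectively lacked the bounds check shown here, which is why a one-bit alteration in an otherwise standard advertising packet was enough to corrupt critical memory.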

The overflowed memory is then leveraged by attackers to run malicious code on the chip, opening a backdoor through which the chip can be commanded wirelessly. From there, an attacker can manipulate the access point’s main processor and take full control of it, first locally and then remotely.

“The Texas Instruments chips are so common that an attacker could simply walk into a lobby of a company, scan for available Wi-Fi networks and begin the attack, on the assumption the BLE vulnerability is present,” said Nadir Izrael, CTO and co-founder of Armis.

A second vulnerability (CVE-2018-7080) was discovered by Armis in Texas Instruments’ over-the-air firmware download feature, used in Aruba’s Series 300 Wi-Fi access points, which also use the BLE chip.

“This vulnerability is technically a backdoor in BLE chips that was designed as a development tool, but is active in these production access points,” according to Armis. “It allows an attacker to access and install a completely new and different version of the firmware — effectively rewriting the operating system of the device.”

Researchers said the second vulnerability exists because the over-the-air security mechanism can’t differentiate between “trusted” or “malicious” firmware updates. By installing their own firmware update, an attacker can gain a foothold on the hardware and take over the access points, spread malware and move laterally across network segments, researchers said.
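The missing control is cryptographic verification of the image before installation. A minimal sketch of such a check follows; it uses a symmetric HMAC purely for brevity (real secure-update schemes verify an asymmetric signature against a public key stored on the device), and the key and function names are illustrative, not from the affected products.

```python
import hashlib
import hmac

# Illustrative shared key; a production design would verify an asymmetric
# signature against a public key provisioned on the device instead.
DEVICE_KEY = b"example-device-provisioning-key"

def sign_firmware(image: bytes) -> bytes:
    """Vendor side: tag the image so the device can authenticate it."""
    return hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()

def accept_firmware(image: bytes, tag: bytes) -> bool:
    """Device side: install only images whose tag verifies.

    Without a check like this, any over-the-air image is installed --
    effectively the behavior Armis described in the vulnerable feature.
    """
    expected = hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

official = b"FIRMWARE-v2.2.2-example"
malicious = b"FIRMWARE-attacker-example"

tag = sign_firmware(official)
print(accept_firmware(official, tag))    # True: authentic update accepted
print(accept_firmware(malicious, tag))   # False: tampered image rejected
```

Because the vulnerable download path performed no equivalent authentication step, an attacker's image and the vendor's image were indistinguishable to the chip.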

The vulnerabilities were collectively given the name BleedingBit from the way researchers were able to overflow packets at the bit level in the BLE memory module.

BLE is a relatively new Bluetooth protocol designed for low-power-consumption devices such as IoT hardware. It’s significant for a number of reasons, such as its mesh capabilities, but also because it evolves the protocol from consumer uses (headphones and smartphone data transfers) to commercial IoT uses.

For this reason, Seri said there is concern that the BleedingBit vulnerabilities could impact a larger universe of BLE devices, such as smart locks used in hotel chains and point-of-sale hardware.

Last year, Armis discovered nine zero-day Bluetooth-related vulnerabilities, dubbed BlueBorne, in Bluetooth chips used in smartphones, TVs, laptops and car audio systems. The scale of affected devices was massive, estimated at billions of Bluetooth devices.

(This article was updated with a comment from Cisco Systems on Friday 11/2 at 1pm ET)

Originally Posted On: techrepublic.com

The explosion of data in consumer and business spaces can place our productivity at risk. There are ways you can resist drowning in data.

The pace of data creation steadily increases as technology becomes more and more ingrained in people’s lives and continues to evolve.

According to Forbes.com last May, “there are 2.5 quintillion bytes of data created each day at our current pace, but that pace is only accelerating with the growth of the Internet of Things (IoT). Over the last two years alone 90 percent of the data in the world was generated.”

While technology should make our lives easier, the information it provides can negatively impact our mental function by overwhelming us with too much input.

However, don’t confuse cognitive overload with work overload. Whereas work overload is simply having too much to do and not enough time to complete it, cognitive overload refers to having too much information to process at once.

SEE: Leadership spotlight: How to make meetings worthwhile (Tech Pro Research)

Fouad ElNaggar, co-founder and CEO of Sapho, an employee experience software provider based in San Bruno, Calif., is passionate about cognitive overload. Together we developed some tips for workers on how to fix the problem.


The irony of productivity applications is that they can actually make you less productive. Microsoft Office includes Outlook, an email application, which can “helpfully” notify you when new email arrives.

Sadly, this can also contribute to your information overload: if you’re in the middle of a task and switch to Outlook to read an email, you might forget about the task entirely. Instant messaging apps, or frankly anything that dings or pops up an alert, are just as distracting. When trying to stay focused on a task, close or shut off any applications that could serve as potential distractions. Oh, and silence your phone, too.


If you can’t close a potentially distracting application because you need it available, you can still quiet it down. Between Slack, Gchat, calendar, email and text messages, it probably seems like those tiny dialog boxes pop up on your screen all day long. Take a few minutes to evaluate which push notifications actually help you get work done, and turn off the rest.

SEE: Project prioritization tool: An automated workbook (Tech Pro Research)


Constantly checking and responding to email is a major time drain. Set aside two times a day to answer emails, and do not check it any other time. Put your phone on “Do Not Disturb,” and make it a point to not let notifications interrupt you during that time.


It’s easy and tempting to check social media, or your favorite news outlet while working, especially if you’re waiting for a task to finish before you proceed (such as rebooting a server or uploading a file). However, this just puts more data into your current memory banks, so to speak, so that instead of thinking about that server patching project now you’re also thinking about the NFL draft or how many people “like” your funny Facebook meme. Save social media for lunch time or after work. It’ll be more meaningful, and you can keep your work and recreation separate, as it should be.


I keep a very minimalistic workspace: a family picture, a Rick Grimes (from “The Walking Dead,” which contains many parallels to IT life) keychain figure, and a calendar. No fancy furniture, no posters, no inspiring slogans, and no clutter. This helps me stay oriented to what I need to do without the sensory overload.

I also apply the same principles to my computer: I only keep running the programs I need, and I even close unnecessary browser tabs, SSH sessions, and Windows Explorer windows so that I’m concentrating only on the task at hand.

SEE: IT jobs 2018: Hiring priorities, growth areas, and strategies to fill open roles (Tech Pro Research)


You may not always have a choice, but avoiding multitasking is one of the best things you can do to keep your brain from being overwhelmed. Dividing your attention among four or five parallel tasks is a sure-fire way to ensure that those tasks take longer or are completed less efficiently than if you accomplished them one at a time. Worse, it’s all too easy to drop tasks entirely as your attention shifts, resulting in uncompleted work.


Document your to-do lists, operational processes, and daily procedures you need to follow (building a new server, for instance) so that you don’t rely on memory and can quickly handle tasks—or better yet—refer them to someone else. Anytime I discover how something works or what I can improve upon, I update the related electronic documentation so I don’t have to comb through old emails, leaf through handwritten notes, or worse, ask coworkers to fill in missing details that I should have recorded.


In addition to relying upon established documentation to make your efforts more productive, take notes during difficult operations such as a server recovery effort or a network troubleshooting endeavor. These notes serve as a “brain dump” of your activities, letting you purge them from memory and refer back to the information later, if needed.

Believe me, there’s nothing more challenging than sorting through a complex series of tasks during an outage post-mortem to recall what you did to fix the problem. A written record can save your brain.

SEE: Comparison chart: Enterprise collaboration tools (Tech Pro Research)


This should be a no-brainer, yet too many people consider themselves too busy to take a break. Stepping away from work to hit the “pause” button isn’t just about relaxing your brain so that you return with a more productive mindset; a quick walk around the building can also give you room to think and come up with new ideas or solutions to problems you’re facing, eliminating one more source of information overload.


I’ve written about some of the problems of the infamous (and unfortunately common) open-seating plan in companies. In a nutshell, having no privacy and sitting in close physical and aural proximity even to individuals considered close friends strains working relationships and breeds frustration.

Avoiding cognitive overload isn’t just about not taking on or dealing with too much at once, but it’s also about not letting other people’s activities intrude upon your own productivity. Whether it’s an annoying personal phone call, playing music or even just chewing loudly, other people’s nearby activity can be a source of unwanted details, which reduces your capacity to do your job. You may not have a choice about sitting in an assigned open space seat, but take advantage of opportunities such as working from home, using an available conference room, or moving to an empty part of the office when you really need to focus.


Facing the entirety of a complex project is a daunting mission. It’s better and more effective to break a project down into subcomponents, and then focus on these separately, one at a time.

For instance, say you want to migrate users, computers, and services from one Active Directory domain to another. This would be overwhelming to focus on at once, so the best way to proceed is to divide the project into tasks. One task could be migrating user accounts and permissions. The next task could be migrating computer accounts, and the task after that could be addressing DNS changes, and so on. Plan it out in advance, and then tackle it piece-by-piece.
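The same decomposition can be written down as an explicit, ordered plan before any work starts. A minimal sketch follows; the phase functions are hypothetical stubs standing in for real migration work, not actual tooling:

```python
def migrate_user_accounts() -> str:
    # Stub for the first chunk of work: accounts and permissions.
    return "user accounts and permissions migrated"

def migrate_computer_accounts() -> str:
    # Stub for the second chunk: computer accounts.
    return "computer accounts migrated"

def update_dns_records() -> str:
    # Stub for the third chunk: DNS changes.
    return "DNS changes applied"

# Plan the phases in advance, then tackle them one at a time,
# recording the outcome of each before moving on to the next.
PHASES = [
    ("Phase 1", migrate_user_accounts),
    ("Phase 2", migrate_computer_accounts),
    ("Phase 3", update_dns_records),
]

def run_migration() -> list:
    completed = []
    for name, step in PHASES:
        completed.append(f"{name}: {step()}")
    return completed

for line in run_migration():
    print(line)
```

The point isn't the code itself but the discipline it encodes: each phase is small enough to hold in your head, and finishing one produces a concrete, verifiable result before the next begins.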


Don’t let colleagues fill in your day with meaningless meetings. Have a conversation with your coworkers about which meetings are absolutely necessary for you to participate in and skip the rest. If you are a manager or leader, encourage your employees to schedule in-person meetings only when they are absolutely necessary.


You spend enough time on screens during the day. The simple act of charging your phone in another room gives you time to really disconnect. It also gives you a chance to wake up refreshed, and think about the day ahead before reactively reaching for your device and checking social media or email.

SEE: Research: The evolution of enterprise software UX (Tech Pro Research)


ElNaggar and I also thought of a couple of tips for business leaders on ways to reduce cognitive overload for their team. These tips include:


Take the time to learn what processes or tools are pain points for your employees’ productivity. Research which solutions can automate certain tasks or limit daily distractions and implement them across your workforce.


ElNaggar urges leaders to “embrace the idea that employee experience matters, which will have a ripple effect in their organization.” He recommends that leaders start to develop more employee-centric workflows that reduce interruptions for their employees, helping them focus on priorities and accomplish more work.

An example of an employee-centric workflow would be a business application or web portal, which gives employees a single, actionable view into all of their systems and breaks down complex processes into single-purpose, streamlined workflows, allowing employees to be more productive.

“Without leadership teams championing an employee-centric mindset, nothing will really change in the mid and lower levels of a company. Business leaders must start thinking about the impact their employees’ digital experience has on their work performance and overall satisfaction, and support the idea that investing in employee experience will drive employee engagement and productivity,” ElNaggar concluded.

Originally Posted On: informationweek.com

Agile, DevOps, Continuous Delivery and Continuous Deployment all help improve software delivery speed. However, as more applications and software development tools include AI, might software developers be trading trust and safety for speed?

The software delivery cadence has continued to accelerate with the rise of Agile, DevOps and continuous processes including Continuous Delivery and Continuous Deployment. The race is on to deliver software ever faster using fewer resources. Meanwhile, for competitive reasons, organizations don’t want to sacrifice quality in theory, but sometimes they do in practice.

Recognizing the need for speed and quality, more types of testing have continued to “shift left.” Traditionally, developers have always been responsible for unit testing to ensure the software meets functional expectations, but today, more of them are testing for other things, including performance and security. The benefit of the shift-left movement is the ability to catch software flaws and vulnerabilities earlier in the lifecycle when they’re faster, easier and cheaper to fix. That’s not to say that more exhaustive testing shouldn’t be done; shift-left testing just ensures that fewer defects and vulnerabilities make their way downstream.
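A concrete, hedged illustration of shifting security left (the function, names, and validation policy below are hypothetical, not from the article): the developer writes a security-minded unit test alongside the code, so an injection-style flaw surfaces at build time rather than in a downstream audit.

```python
import re

def build_lookup_query(username: str):
    """Return a parameterized query -- a shift-left security control.

    Validation and parameterization live in the code itself, so unit
    tests run on every commit catch injection-style input immediately,
    instead of a later security review finding it downstream.
    """
    if not re.fullmatch(r"[A-Za-z0-9_]{1,32}", username):
        raise ValueError(f"invalid username: {username!r}")
    # Placeholders keep user input out of the SQL text entirely.
    return "SELECT id FROM users WHERE name = ?", (username,)

def test_accepts_normal_username():
    sql, params = build_lookup_query("alice")
    assert params == ("alice",) and "?" in sql

def test_rejects_injection_attempt():
    try:
        build_lookup_query("alice'; DROP TABLE users; --")
    except ValueError:
        return  # expected: malicious input never reaches the database
    raise AssertionError("injection-style input was accepted")

test_accepts_normal_username()
test_rejects_injection_attempt()
print("shift-left checks passed")
```

Deeper penetration testing still belongs later in the pipeline; checks like these simply stop the cheapest-to-fix class of defects at the point where they are introduced.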

Enter AI. More developers are including artificial intelligence in their applications, and they’re also using more AI-powered tools to do their work. Granted, not all forms of AI are equally complex or intelligent; however, the level of intelligence embedded in products continues to increase. The danger is that developers and software development tool vendors are racing to implement AI without necessarily understanding what it is they’re implementing or the associated risks.

“In my first foray into applied AI, we had to consider the implications of interfacing to triply-redundant flight control systems and weapons that kill,” said Gregg Gunsch, a retired US Air Force lieutenant colonel and retired college professor with over 20 years of experience teaching and leading research in applied artificial intelligence and machine learning, information security for computer science/engineering majors and digital forensic science. “That tended to instill a strong ‘seriously test before release’ attitude.”

Not every developer is building software with life-and-death consequences, but many are building applications and firmware that can have material impacts on end users, the enterprise, customers, partners, governments and more. Given that some forms of AI may yield unpredictable results because of the way they’re designed or because there are flaws or bias in the data, the question is whether the ongoing quest for ever-faster software delivery is practical, and if it is, whether it’s wise.

“I get concerned about putting guardrails in now, or we may miss what’s happening and then realize where the bots went wrong,” said Scott Likens, new services and emerging tech leader at PwC.

Value drives the need for speed

Part of the continuous process mantra is delivering value quickly to consumers for competitive reasons. However, quick execution and an “innovation at any cost” mentality also produce broken user experiences and functional gaffes that end users would happily trade for better quality software that’s delivered less frequently.

“I am very tired of being an uninformed beta test subject, but I recognize that crowdsourcing of some kinds [must] happen to collect the data necessary for training the systems and steering development. Rapid-prototyping around the user is a key tool in design engineering,” said Gunsch. “Sometimes, there may not be other good ways to collect the massive amounts [of data] needed for learning systems besides just experimenting on the entire user population.”

Attitudes about speed-quality tradeoffs differ around the world. According to PwC’s Likens, speed trumps quality in China, but the same is not true in the U.S.

“We had the social media wave where consumers wanted that instant change, but now they’re almost revolting against how often things change,” said Likens.

User attitudes also vary based on the nature of the application itself. For example, consumers expect banking applications to be reliable and secure, but they don’t have the same expectations of a social media selfie app.

“You’ve had data breaches and data leakage and now consumers are willing to accept less to be protected,” said Likens. “You can’t innovate at all costs for core enterprise [or core consumer] apps.”

Will AI help or hinder software delivery speed?

The potential risks of self-learning AI seem to indicate that trust should be included in shift-left practices and software development processes in general. While it may add yet another factor to consider earlier in the software development lifecycle (SDLC), embedding trust into processes would help ensure that this new element is executed efficiently. In fact, AI may be part of the solution that ensures that trust is not only contemplated but validated and verified.

Already, AI is being used in parts of the SDLC, such as automated software testing tools that use AI to ensure better test coverage and to prioritize what needs to be tested. It’s also driving higher levels of efficiency by enabling more tests to be run in shorter timeframes.
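One concrete example of that efficiency gain is test prioritization. Commercial tools use learned models for this; the sketch below substitutes a deliberately crude stand-in — ranking tests by historical failure rate — purely to illustrate the idea. The test names and run history are invented.

```python
from collections import defaultdict


def prioritize_tests(history):
    """Rank test cases so the likeliest-to-fail run first.

    history: list of (test_name, passed) tuples from previous CI runs.
    Returns test names sorted by descending historical failure rate --
    a crude stand-in for the learned models commercial tools use.
    """
    runs = defaultdict(int)
    failures = defaultdict(int)
    for name, passed in history:
        runs[name] += 1
        if not passed:
            failures[name] += 1
    return sorted(runs, key=lambda n: failures[n] / runs[n], reverse=True)


history = [
    ("test_login", True), ("test_login", False),        # flaky: fails half the time
    ("test_checkout", False), ("test_checkout", False),  # consistently failing
    ("test_search", True), ("test_search", True),        # stable
]
print(prioritize_tests(history))
# ['test_checkout', 'test_login', 'test_search']
```

With the riskiest tests scheduled first, a time-boxed test run surfaces likely failures sooner, which is where the shorter-timeframe benefit comes from.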

According to Likens, machine learning and computer vision can produce effective UI designs because the system can look at far more permutations than a human could and generate code from it.

“Now you’re seeing AI is generating code at a level that’s human-usable. A lot of stuff we do on the UI we can do at a high quality level because we feed in unbiased training data and hand-drawings, something that machine learning vision can recognize as something that looks good,” said Likens.

Not all aspects of software development and delivery have been automated using AI yet, but more will be automated over time as tools become more sophisticated and software development practices continue to evolve. AI can help accelerate software delivery, but its application will be more valuable if that speed can be matched with elements of quality which include security and trust.

For more on the trends in software development and AI, check out these recent articles.

Lisa Morgan is a freelance writer who covers big data and BI for InformationWeek. She has contributed articles, reports, and other types of content to various publications and sites ranging from SD Times to the Economist Intelligence Unit. Frequent areas of coverage include …

Originally Posted On: blog.360totalsecurity.com

On October 18, 2018, 360 Threat Intelligence Center captured for the first time a sample that used Excel 4.0 macros to spread the Imminent Monitor remote control Trojan — just 10 days after security researchers at Outflank, a foreign security vendor, first publicly demonstrated on October 6, 2018 that Excel 4.0 macros could be used to execute shellcode. Although Excel 4.0 macro technology has existed for more than 20 years and was often used to create macro viruses in its early days, Microsoft long ago replaced it with VBA (Visual Basic for Applications) macros, so Excel 4.0 macros are not well known to the public. Moreover, because Excel 4.0 macros are stored in the Workbook OLE stream of Excel 97-2003 format files (.xls, a compound binary file format), they are very difficult for anti-virus software to parse and detect.

360 Threat Intelligence Center analyzed in detail how Excel 4.0 macros are stored in Excel documents and found, through in-depth research, that by using certain techniques to hide the Excel 4.0 macros and by specially processing the shellcode, an attacker can evade both static and dynamic detection by almost all anti-virus software and execute arbitrary malicious code. Since this new Excel 4.0 macro exploitation technique has been published, and attacks using it to spread the Imminent Monitor remote control Trojan have already emerged, 360 Threat Intelligence Center is releasing this analysis report and reminding users to guard against such attacks.

By deeply analyzing how Excel 4.0 macros are stored in the compound binary file format, 360 Threat Intelligence Center constructed a proof-of-concept sample that can remotely execute arbitrary malicious code. Testing showed that many well-known anti-virus products cannot detect such samples.
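For defenders, one practical consequence of the storage format described above is that Excel 4.0 macro sheets can be hunted for directly in the Workbook stream. The sketch below is a simplified heuristic, not 360's engine: it walks the BIFF record chain and flags BOUNDSHEET records whose sheet-type byte marks an Excel 4.0 macro sheet, noting whether the sheet is hidden. Extracting the Workbook stream from the .xls OLE container (for example, with the olefile package) is assumed to have happened already.

```python
import struct

BOUNDSHEET = 0x0085  # BIFF record describing each sheet in the Workbook stream


def find_xlm_sheets(workbook_stream: bytes):
    """Scan a raw BIFF Workbook stream for Excel 4.0 (XLM) macro sheets.

    Walks the record chain and flags BOUNDSHEET records whose sheet-type
    byte is 0x01 (Excel 4.0 macro sheet); data[4] carries the hidden state
    (1 = hidden, 2 = "very hidden").  A heuristic sketch only.
    """
    hits = []
    pos = 0
    while pos + 4 <= len(workbook_stream):
        rec_id, rec_len = struct.unpack_from("<HH", workbook_stream, pos)
        data = workbook_stream[pos + 4 : pos + 4 + rec_len]
        if rec_id == BOUNDSHEET and len(data) >= 6:
            hidden = data[4] & 0x03
            sheet_type = data[5]
            if sheet_type == 0x01:
                hits.append({"offset": pos, "hidden": hidden})
        pos += 4 + rec_len
    return hits


# Synthetic BOUNDSHEET record: "very hidden" macro sheet named "A"
fake = struct.pack("<HH", BOUNDSHEET, 8) + struct.pack("<IBB", 0, 0x02, 0x01) + b"\x01A"
print(find_xlm_sheets(fake))  # → [{'offset': 0, 'hidden': 2}]
```

A hidden or "very hidden" macro sheet is exactly the combination the attack described here relies on, so the two fields together make a useful triage signal.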

Analysis of the attack sample spreading the Imminent Monitor remote control Trojan
360 Threat Intelligence Center first captured this attack sample, which uses an Excel 4.0 macro to spread the Imminent Monitor remote control Trojan, on October 18, 2018. At the time, only one anti-virus engine on VirusTotal detected it:

The malicious Excel 4.0 macro code is hidden in the workbook; it can be revealed by selecting Unhide:

The macro code downloads a file with a PDF suffix from hxxps://jplymell.com/dmc/InvoiceAug5e1063535cb7f5c06328ac2cd66114327.pdf and executes it. The file is actually a malicious MSI file. When run via msiexec, it decrypts and drops a .NET executable named 033ventdata.exe in the %temp% directory and executes it:

The vBM= method in Form1 calls the gRQ= function:

The gRQ= function first obtains configuration information, including the C&C address to connect to (linkadrum.nl), and checks whether the current process path is “%temp%\ProtectedModuleHost.exe”. If not, it moves the current file to that directory and deletes the current process file:

If the process path matches, a corresponding LNK file is created in the Startup directory to achieve persistence:

It then starts the InstallUtil.exe process and injects the Trojan’s host PE file into it:

Analysis of the Trojan’s main module
The injected Trojan PE file is also a .NET program. After running, it loads the 7-Zip LZMA library DLL and calls it to decompress the Trojan EXE it carries, loading the result into memory. The EXE is heavily obfuscated. Once loaded in memory, it connects to linkadrum.nl and accepts commands, implementing full remote control functionality:

Decompilation also reveals the telltale string “Imminent-Monitor-Client-Watermark”:

The Imminent Monitor RAT is commercial remote control software. Its official website is imminentmethods.net, and it covers essentially the full range of remote control functions:

Analysis of advanced attacks in recent years shows that, because exploiting vulnerabilities such as Office 0-days is costly, most attackers prefer to use Office VBA macros to execute malicious code. The now-public Excel 4.0 macro exploitation technique will pose new challenges for detection and removal.

Enterprise users should be cautious about opening documents of unknown origin. If necessary, disable all macro execution in Excel via File > Options > Trust Center > Trust Center Settings > Macro Settings.

360 Threat Intelligence Center now supports detection of such attacks and of samples using this macro-based, exploit-free technique. Additionally, its self-developed scanning engine can statically extract the macros and shellcode exploit code from attack samples:

Learn More About 360 Total Security Here.

From autonomous things and blockchain to quantum computing: how many of these technologies are you ready for?

This article originally appeared on ZDNet

Tech analyst firm Gartner has compiled a list of the top ten strategic technology trends that organisations need to explore in 2019. According to Gartner, these technologies have substantial disruptive potential and are either on the edge of making a big impact, or could reach a tipping point in the next five years.

Some of these trends will be combined: “Artificial intelligence (AI) in the form of automated things and augmented intelligence is being used together with the Internet of Things (IoT), edge computing and digital twins to deliver highly integrated smart spaces,” explained Gartner vice-president David Cearley.


1. Autonomous things

This includes robots, drones and autonomous vehicles that use AI to automate functions previously performed by humans. The next shift is likely to be from standalone intelligent things to swarms of collaborative devices working either independently or with human input, Gartner predicts. For example, a drone could decide that a field is ready for harvesting, and dispatch a robot harvester. “Or in the delivery market, the most effective solution may be to use an autonomous vehicle to move packages to the target area. Robots and drones on board the vehicle could then ensure final delivery of the package,” said Cearley.

2. Augmented analytics

Augmented analytics focuses on the use of machine learning to improve how analytics content is developed and used. Gartner said augmented analytics capabilities will quickly go mainstream as part of data preparation, data management, modern analytics, business process management, process mining and data science platforms. As it automates the process of data preparation, insight generation and insight visualisation, it could eliminate the need for professional data scientists in many scenarios.
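As a toy illustration of what automating insight generation means, the sketch below scans a small table for strongly correlated column pairs and emits plain-language findings. The column names, data and correlation threshold are all invented; real augmented-analytics platforms do far more than pairwise correlation.

```python
from itertools import combinations
from math import sqrt


def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)


def auto_insights(table, threshold=0.8):
    """Emit plain-language findings for strongly correlated numeric columns."""
    findings = []
    for a, b in combinations(table, 2):
        r = pearson(table[a], table[b])
        if abs(r) >= threshold:
            direction = "rises" if r > 0 else "falls"
            findings.append(f"'{b}' {direction} with '{a}' (r={r:+.2f})")
    return findings


# Hypothetical marketing data
table = {
    "ad_spend": [10, 20, 30, 40, 50],
    "sales":    [12, 24, 31, 44, 52],
    "returns":  [7, 3, 9, 2, 6],
}
for finding in auto_insights(table):
    print(finding)
```

The surfaced finding (sales rising with ad spend) is the kind of automatically generated, human-readable insight that such tools put in front of business users without a data scientist in the loop.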

3. AI-driven development

Developing applications with AI-powered features will become easier, Gartner said, using predefined AI models delivered as a service. Another shift is that AI will be used in the data science, application development and testing elements of the development process. By 2022, at least 40 percent of new application development projects will have AI co-developers on their team, the analyst firm predicts. “Tools that enable non-professionals to generate applications without coding are not new, but we expect that AI-powered systems will drive a new level of flexibility,” said Cearley.

SEE: Tech Pro Research: IT Budget Research Report 2019 (Tech Pro Research)

4. Digital twins

A digital twin refers to the digital representation of a real-world entity or system. By 2020, Gartner estimates there will be more than 20 billion connected sensors and endpoints, and digital twins will exist for potentially billions of things, helping companies to better understand their systems and business processes.

5. Edge computing

Edge computing is a growing area of interest, for now driven mostly by the IoT and the need to keep processing close to the edge of the network rather than in a central cloud server. Over the next five years, specialised AI chips, along with greater processing power, storage and other advanced capabilities, will be added to a wider array of edge devices, Gartner said. In the long term, 5G will offer lower latency and higher bandwidth, and enable more edge endpoints per square kilometre.

6. Immersive experience

Gartner looks beyond virtual reality and augmented reality to a future model of immersive user experience, where we connect with the digital world across hundreds of surrounding edge devices. These include traditional computing devices, wearables, cars, environmental sensors and consumer appliances.

“This multi-experience environment will create an ambient experience in which the spaces that surround us define ‘the computer’ rather than the individual devices. In effect, the environment is the computer,” said Cearley.

7. Blockchain

Current blockchain technologies and concepts are immature, poorly understood and unproven in mission-critical, at-scale business operations, said Gartner. Despite this, the potential for disruption means that CIOs should begin evaluating blockchain, even if they don’t aggressively adopt these technologies in the next few years.

8. Smart spaces

A smart space is a physical or digital environment in which humans and technology-enabled systems interact, such as smart cities, digital workplaces, smart homes and connected factories. Gartner said this area is growing fast, with smart spaces becoming an integral part of our daily lives.

9. Digital ethics and privacy

Companies need to proactively address issues around digital ethics and privacy. “Shifting from privacy to ethics moves the conversation beyond ‘are we compliant’ towards ‘are we doing the right thing’,” said Cearley.

10. Quantum computing

It’s still very early days for quantum computing, which promises to help find answers to problems too complex for traditional computers to solve. Industries including automotive, financial, insurance, pharmaceuticals and the military have the most to gain from advancements in quantum computing, and Gartner said that CIOs should start planning for it by understanding how it can apply to real-world business problems, with the aim of using it by 2023 or 2025.

Originally Posted On: aitp.org

As the lead for information security at Chicago Public Schools in 2013, Edward Marchewka wanted a way to measure how well the nation’s third largest public school district was doing at protecting its sensitive data.

Marchewka couldn’t find a model he liked, so he built one. It didn’t take long for him to see that there was a market gap for aggregating IT and information security metrics – one that he was well-positioned to fill. In 2015, he formed CHICAGO Metrics™, a platform that helps companies tell a better story by managing their key IT and information security risks.

Starting your own IT consulting business can be both enticing and intimidating. You exchange a corporate safety net for flexibility and autonomy. See below for tips on how to make that transition a success.


When you start an IT consulting business, you’re no longer solely focused on your area of IT expertise. You’re also in charge of project management, bookkeeping, contracts, legal matters related to starting a business, and potentially, employees.

“There’s a lot of behind-the-scenes activities with running and growing a business that most people don’t see and know unless they’re doing it,” says John Sterrett, founder of Austin, Texas-based Procure SQL, which specializes in SQL Server solutions.

He adds that as a consultant, employability skills – such as an ability to communicate with clients – can be just as, if not more, important than technical skills.

“If you’re going to be a consultant, at the end of the day, you’re selling hours,” he says. “You’re trying to help people. That’s how you’re staying in business. Being able to be quiet and listen can be very, very critical. I think even more so than the tech skills.”


Sterrett recommends having a business plan. It doesn’t need to be a lengthy, formal document, but writing down goals – and the milestones you need to hit to meet those goals – can go a long way.

That plan should include a period of months where you may not be able to pay yourself a salary, thanks to early startup costs and other growing pains, such as building a customer base. Make sure you’ve got at least three to six months of living costs saved before embarking on a consulting business, recommends Sterrett.

Be prepared to face some failure. Some early hiccups can be a good opportunity to refine your IT consulting business.

“A lot of people are afraid of failure,” Sterrett says. “When running a business, it’s probably good to fail as fast as you can, so you can figure out what works and what doesn’t, and keep moving in the direction of your goals.”


You need to be able to differentiate yourself: whether it’s through your pricing, your bundle of services, your expertise, or the types of clients you serve.

“Knowing where you fit in and who your competitors are is really important in this space,” Marchewka says.

Finding the right price point for your services can be tricky. Do your homework on what competitors are charging, but it may take some trial and error to find your sweet spot.

Marchewka says he discovered he was turning off bigger clients who assumed his bargain basement prices meant the quality must be low.

“Don’t undervalue yourself,” he says, adding that he’s been able to take on nonprofits and governmental clients by offering discounts to applicants who show need.


Marchewka started his business with two clients in hand, but he says generating new business is one of the hardest parts of going it alone.

“I like having the solutions, but generating the business is hard,” he says. “It requires work.”

Rather than hiring an expensive sales force, Marchewka says he partners with other vendors who have complementary products, building clients through a word-of-mouth network. Sterrett recommends making sure you’re not relying on just one big client. Hedge your bets by diversifying your roster of clients.

Build your network at events and conferences. Sterrett says he meets people by speaking at SQL Server conferences. Marchewka says he nabbed a major client after someone from the organization saw him speak at an event.

“Don’t be afraid to get on a stage and present yourself as that expert in your space,” Marchewka says. “People will recognize that and follow up with you.”

Originally Posted On: techrepublic.com

Three jobs completely new to the IT industry will be data trash engineer, virtual identity defender, and voice UX designer, according to Cognizant.

With technology flooding the enterprise, many people fear it will take over their jobs. However, tech like artificial intelligence (AI) and machine learning will actually create more jobs for humans, according to a recent Cognizant report. The report outlines 21 “plausible and futuristic” jobs that will surface in the next decade.

The 21 jobs follow three major underlying themes: ethical behaviors, security and safety, and dreams, said the report. These themes come from humans’ deeper aspirations for the future of the enterprise and daily life. Humans want machines to be ethical; humans want to feel safe in a technologically fueled future; and humans have always dreamed of a futuristic world, which is now coming to fruition, according to the report.

SEE: Artificial intelligence: Trends, obstacles, and potential wins (Tech Pro Research)

Some of the jobs on Cognizant’s list could spark life-long careers, and some positions might be more fleeting, said the report.


  1. Cyber attack agent
  2. Voice UX designer
  3. Smart home design manager
  4. Algorithm bias auditor
  5. Virtual identity defender
  6. Cyber calamity forecaster
  7. Head of machine personality design
  8. Data trash engineer
  9. Uni4Life coordinator
  10. Head of business behavior
  11. Joy adjutant
  12. Juvenile cybercrime rehabilitation counselor
  13. Tidewater architect
  14. Esports arena builder
  15. VR arcade manager
  16. Vertical farm consultant
  17. Machine risk officer
  18. Flying car developer
  19. Haptic interface programmer
  20. Subscription management specialist
  21. Chief purpose planner

Click here for descriptions of all 21 positions.

The big takeaways for tech leaders: 

  • Emerging tech will actually create a whole new set of jobs for humans in the next 10 years, with some having more staying power than others. — Cognizant, 2018
  • The tech jobs of the future all follow three underlying themes that humans share: Ethical behaviors, security and safety, and dreams. — Cognizant, 2018

Originally Posted On: informationweek.com

When hiring gets tough, IT leaders get strategic. Here’s how successful organizations seize the experts their competitors only wish they could land.

The technology industry’s unemployment rate is well below the national average, forcing companies to compete aggressively for top talent. When presented with a range of recruitment strategies by a recent Robert Half Technology questionnaire — including using recruiters, providing job flexibility and offering more pay — most IT decision makers said they are likely to try all approaches in order to land the best job candidates for their teams.

“We’re currently in a very competitive hiring market,” noted Ryan Sutton, district president for Robert Half Technology. “Employers want to hire the best talent to help keep their organization’s information safe, but so do a lot of other companies.”

Robert Half’s research finds that software development and data analytics experts are the most challenging to hire. Many other talents are scarce, too. “Some of the most in-demand skills right now include cloud security, security engineering, software engineering, DevOps, business intelligence and big data, as well as expertise in Java full-stack, ReactJS and AngularJS,” Sutton said.

What works

Finding qualified job candidates typically requires using a combination of strategies. But it’s also important to be able to move quickly. “At the core of the labor market now is a demand for speed and efficiency in the hiring process, but don’t confuse an expeditious process with a hastily made decision,” Sutton warned. “Some smart options would be to work with a specialized recruiter who knows your local market well; increasing the pay and benefits package to better attract a top candidate; and losing some of the skills requirements on your job description that aren’t must-haves to widen your talent pool.” He also reminded hiring managers to not underestimate the power of networking. “Let your contacts know you’re looking to hire for a certain position.”

Look beyond the typical sources, suggested Art Langer, a professor and director of the Center for Technology Management at Columbia University and founder and chairman of Workforce Opportunity Services (WOS), a nonprofit organization that connects underserved and veteran populations with IT jobs. “There is a large pool of untapped talent from underserved communities that companies overlook,” he explained. Businesses are now competing in a global market. “New technology allows us to connect with colleagues and potential partners around the world as easily as with our neighbors,” Langer said. “Companies hoping to expand overseas can benefit from employees who speak multiple languages.”

Companies need to explore different models of employment if they want access to the best and the brightest job candidates, observed Nick Hamm, CEO of 10K Advisors, a Salesforce consulting firm. “Some of the most talented professionals are choosing to leave full-time employment to pursue freelancing careers or start their own small consulting companies as a way to gain more balance or reduce commute times,” he advised. “If companies want access to these individuals, they’ll need the right processes and mindset in place to incorporate contract employees into core teams.” Using a talent broker to find the right experts, vet them and apply them inside an organization to solve business problems can alleviate many of the challenges people may now have tapping into the gig economy, Hamm added.

John Samuel, CIO of Computer Generated Solutions, a business applications, enterprise learning and outsourcing services company, advised building some flexibility into job descriptions and requirements. “In this tight job market, a good way is to find candidates with the right attitude and a solid foundation and then train them in areas where they lack experience,” he said. Like Sutton, Samuel believes that many job descriptions are unrealistic, listing many requirements that aren’t core to the job’s role. “Rather than limiting your potential pool of candidates, simplify the job description to include your core requirements to entice applicants to fill open roles,” Samuel recommended.

Mike Weast, regional IT vice president at staffing firm Addison Group, urged hiring managers not to rely on software searches, no matter how intuitive they may claim to be, to uncover qualified job candidates. “There’s a lot of talk about using AI to find qualified candidates, but recruiters are needed to bridge the AI gap,” he claimed. “AI doesn’t qualify a candidate for showing up on time, having a strong handshake or making eye contact when communicating.”

Training current employees to meet the requirements of a vacant position is an often-overlooked method of acquiring experts. “It always makes sense to give existing employees the opportunity to expand their knowledge base and transition into vacant positions,” explained Lori Brock, head of innovation, Americas, for OSRAM, a multinational lighting manufacturer headquartered in Munich. “The roles within IT are merging with the traditional R&D functions as well as with roles in manufacturing, procurement, sales, marketing and more,” she added. “We can no longer consider jobs in IT fields as belonging to an IT silo within any organization.”

Last thought

It’s important to pounce quickly when finding a skilled, qualified job candidate. “Now is certainly not the time to be slow to hire,” Sutton said. “It’s a candidate’s market and they are well aware of the opportunities available to them.”

For more on IT hiring and management check out these recent articles.

John Edwards is a veteran business technology journalist. His work has appeared in The New York Times, The Washington Post, and numerous business and technology publications, including Computerworld, CFO Magazine, IBM Data Management Magazine, RFID Journal, and Electronic …