
Proposed H-1B Visa Program Changes Spell Uncertainty

It’s nearly impossible to have missed political discourse regarding the Trump administration’s temporary travel ban and proposed changes to work visas in the United States. These changes could have broad implications and impact many people in various industries.

In January, President Trump and his team hinted at changing the H-1B work visa program, which allows skilled foreign workers to enter the United States to work for sponsoring companies, particularly in the tech industry.

Currently, the H-1B work visa program allows 65,000 skilled workers — with an additional 20,000 slots for those holding master’s degrees — into the United States to fill specialty occupations in the sciences, engineering, mathematics, medicine, technology, and other fields. Applicants must hold at least a bachelor’s degree.

Applicants can stay in the United States, if they’re working, for a maximum of six years. Many use this as a pathway to citizenship, applying for green cards at the end of their H-1B stays so that they can eventually apply for naturalization. However, this all might change: recently introduced but not yet ratified immigration legislation and executive order drafts may impact H-1B workers.

According to an NPR article, these proposed changes are making tech industry leaders nervous: hundreds of thousands of applicants already compete for the limited number of visa spots, and many industry leaders rely on this talented workforce segment to operate. Opponents argue the H-1B program disproportionately favors foreign workers and does not put America first. However, many in the tech industry feel we need valuable employees and relationships with people from countries all over the globe to compete and remain relevant on a worldwide scale.

According to a CNN article, one recently introduced bipartisan H-1B reform bill may provide work visa program solutions while still allowing tech and other STEM companies to hire on an international basis.

At Value Global, we’re committed to all our employees and clients around the world. We’ll be paying close attention to how the H-1B situation plays out in Congress and the court system as the conclusion will undoubtedly impact us all.

The consulting professionals at Value Global have the best talent in the industry available to provide you with innovative solutions for your business. Our technology experts can help integrate your proprietary and out-of-the-box software solutions, upgrade legacy systems, access data you can’t get to, create applications, and assist with your move to the cloud or provide a fully managed service solution. Learn more.

Value Global Announces Certification as a Texas Minority-Owned Business

Value Global, LLC, with a local office in Tomball, Texas, is pleased to announce that it has been granted certification as a Minority-owned Business Enterprise (MBE) by the Houston chapter of the National Minority Supplier Development Council (NMSDC). The NMSDC is one of the country’s leading corporate membership organizations and aims to help organizations meet their growing need for greater supplier diversity.

Being officially recognized as an MBE strengthens Value Global’s ability to meet the business needs of our partners and clients in Texas, as well as expand our innovative services to new industries and technologies. Value Global is honored to be a part of this vibrant and driven community.

Contact us to learn more about our innovative services and solutions!


Data in Translation

When you travel to a different country, how do you prepare? Making sure your passport is up-to-date is clearly crucial, as is packing the appropriate clothing. Maybe you change your phone plan to an international one so that you can communicate with your friends and family back home.

But what about tools for translation – how do you make sure your devices still work? How do you make sure you can understand the local language? Sure, these concerns aren’t quite as pressing or as obvious as, say, buying a plane ticket, but they will certainly affect your trip if you neglect them: without translation tools, your ability to function drops significantly.

In IT, data traveling between different applications or servers is like people traveling between different countries. Integrators and adapters are the conversion and translation tools that allow data to function and interact with the data that resides in “foreign” territory. We won’t get too technical here, but the following provides a fuller explanation of what integrators and adapters actually are:


Integrators are software or middleware applications that integrate data between two or more business applications. Every company has multiple business applications that provide various functionality or tools for different users or departments. Although they are designed to perform different tasks or provide different functionality, these business applications must be able to “talk” to each other in order to streamline operations and provide a holistic snapshot of the business. Integrators “translate” data from one application’s format into another’s, which allows each application to understand the data in its own context. Integrators also help to increase data security and scalability.

A common and simple example of an integrator would be software that allows an Accounting application to pull and apply data initially entered into a Revenue application; the departments perform different functions, but pull from the same pool of information.
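As a rough sketch of that example (the field names and the sample record below are invented for illustration, not drawn from any real system), an integrator boils down to a translation function between two applications’ schemas:

```python
# Hypothetical schemas: the Revenue app's fields ("ledger_code", "gross",
# "refunds", "fiscal_period") and the Accounting app's ("gl_account",
# "amount_usd", "period") are invented for this example.
def revenue_to_accounting(revenue_record):
    """Translate a Revenue-app record into the Accounting app's schema."""
    return {
        "gl_account": revenue_record["ledger_code"],
        "amount_usd": round(revenue_record["gross"] - revenue_record["refunds"], 2),
        "period": revenue_record["fiscal_period"],
    }

entry = revenue_to_accounting(
    {"ledger_code": "4000", "gross": 1250.00, "refunds": 50.00, "fiscal_period": "2017-Q1"}
)
print(entry)  # {'gl_account': '4000', 'amount_usd': 1200.0, 'period': '2017-Q1'}
```

Real integration middleware adds error handling, batching, and monitoring on top, but the core job is exactly this kind of mapping.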


Adapters perform a function similar to integrators – technically speaking, they are extended code snippets built right into integration software – but they work between two or more servers. Adapters help determine how to read, write, and move data between applications on different servers. Like power converters, adapters provide the tools to transform data from a source and make it accessible to another user.

Some examples of adapters are:

  • File Adapters: read/write files to or from one server to another
  • FTP Adapters: move files to or from one server to another
  • DB (Database) Adapters: read/write data to or from a particular database
  • Queues: determine how to prioritize and process data received from multiple locations. Queues keep data organized, holding it in place even if the business application is down, until the application is running and ready for integration.
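A loose Python sketch of the DB adapter and queue behavior described above (the class shapes, the in-memory SQLite database, and the “invoice” payloads are all invented for illustration):

```python
import sqlite3
from collections import deque

class DBAdapter:
    """Reads/writes rows in a particular database (here, in-memory SQLite)."""
    def __init__(self, conn):
        self.conn = conn
        conn.execute("CREATE TABLE IF NOT EXISTS records (payload TEXT)")
    def write(self, payload):
        self.conn.execute("INSERT INTO records VALUES (?)", (payload,))
    def read_all(self):
        return [row[0] for row in self.conn.execute("SELECT payload FROM records")]

class Queue:
    """Holds incoming data in place until the target application is ready."""
    def __init__(self):
        self.pending = deque()
    def enqueue(self, item):
        self.pending.append(item)
    def drain_into(self, adapter):
        while self.pending:
            adapter.write(self.pending.popleft())

# The target application is "down": data simply accumulates in the queue...
queue = Queue()
queue.enqueue("invoice-001")
queue.enqueue("invoice-002")

# ...and is integrated once the application's database is reachable again.
db = DBAdapter(sqlite3.connect(":memory:"))
queue.drain_into(db)
print(db.read_all())  # ['invoice-001', 'invoice-002']
```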

So what does this process actually look like?

Let’s say a business user in Austin wants to upload or move some data into another application that is running on a Houston server. As step one, business users are generally asked to keep files in a shared location on their server. Once the file is saved in the appropriate place on the Austin server, the integrator application grabs the file using one of the adapters mentioned above (in this case, the FTP adapter) and moves it to the Houston server. The integrator application then uses another adapter (in this case, the File adapter) to read the contents of the file, and writes them into the target application’s database on the Houston server (using the DB adapter). Any errors or exceptions during this process can be sent as email notifications to the business user, indicating whether or not the file has been successfully uploaded.
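The steps above can be simulated in a few lines of Python (a sketch only: two temporary directories stand in for the Austin and Houston servers, a local file copy stands in for the FTP transfer, and the file name and contents are invented):

```python
import shutil
import sqlite3
import tempfile
from pathlib import Path

# Hypothetical setup: temp directories play the roles of the two servers.
austin = Path(tempfile.mkdtemp(prefix="austin_"))
houston = Path(tempfile.mkdtemp(prefix="houston_"))

# Step 1: the business user saves the file to the shared location in Austin.
(austin / "orders.csv").write_text("1001,widget,3\n1002,gadget,7\n")

# Step 2: the "FTP adapter" moves the file to the Houston server.
shutil.copy(austin / "orders.csv", houston / "orders.csv")

# Step 3: the "File adapter" reads the file's contents on Houston...
rows = [line.split(",") for line in (houston / "orders.csv").read_text().splitlines()]

# Step 4: ...and the "DB adapter" writes them into the application's database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id TEXT, item TEXT, qty INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(f"Uploaded {count} orders")  # Uploaded 2 orders
```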

Building your IT infrastructure without adapters and/or integrators is like traveling to a foreign country without translation tools.

It’s certainly possible, but it creates unnecessary hurdles and workarounds for your users, resulting in lost productivity and an increased risk of human error. In one case, a FileNet–Oracle integration project saved a mid-size Oil and Gas company $3.2 million per year in operational costs and took only 3-4 months! Interested in how integrators and adapters can improve your infrastructure? Contact us to learn more!


Why IT Should Be an M&A Power Player

Scenario: After decades of operating under one model, a company’s strategy changes; the board, the executives, and the market forces in a grand alliance decide to shift operations in a new direction. Will they become smaller, nimbler, more focused? Or will they expand their reach, conquering new markets or business verticals? Once the decision is made, the strategy is laid out and a roadmap starts to evolve. Throw a buyer or a seller into the mix and we are witnessing the birth of a brand new enterprise!

Merger and acquisition (M&A) or divestiture periods are undoubtedly disruptive and chaotic. Almost all organizations will struggle with the sequence of priorities because people, processes and technology are inextricably intertwined; splitting them apart or rearranging them inevitably causes a breakdown in some part of the value chain. Core operations do not work without people; people do not work without information systems; information systems do not work without technology; and technology does not work without core operations. This cycle is difficult to break, and it makes organizations incredibly hesitant to tweak any part of the process.

However, within this chaos and disruption comes a tremendous opportunity for transformation, and bringing IT into the strategy at the beginning will help anchor that strategy in the most relevant data and processes. IT lies at the intersection of the Venn diagram that represents both the old organization and the new; the data remains the same, but systems and processes – once identified – can be refined, updated, and streamlined. Business strategy and processes as a whole impact the efficacy of IT, so it makes sense that IT should have a voice and representation at the discussion table. A clear understanding of how information flows through an organization will keep problems – ranging from a small glitch to a full-blown breakdown – from plaguing this incredibly formative time for the new enterprise.

IT also has a broader understanding of:

  • Relevant elements of the M&A or divestiture, and how they map to technology systems
  • Expectations for the department during the transformation period, including:
    • Cutting costs and minimizing the stranded cost burden. Once processes and goals are defined, IT system integration can aid in streamlining effort by reducing redundant or convoluted processes within the infrastructure
    • Estimation for staff utilization before, during, and after transition
    • Understanding the regulatory and tax requirements, both to determine the time allotted for transition and to determine the data that can be shared and the data that needs to be protected. In the case of a divestiture, the separation may spin one publicly traded company off from another, and the parent company is likely to have strict data protection clauses with which IT will have to comply
  • Qualitative elements of the buyer-seller relationship, such as the differences between buying or selling assets with an equity firm and a strategic buyer or spinoff that may be a competitor or collaborator

Regardless of the industry or model the buyer and seller agree to trade in, careful and deliberate planning will expose opportunities for the CIO to capitalize on and provide enhanced value to the business. Among them:

  • Use IT to streamline and corroborate the deal
  • Simplify the IT landscape and aim to reduce fixed costs
  • Seize the current disruption to redefine or add to the IT framework with newer, more agile technology (i.e., Cloud, IoT, AI, Data Analytics, Big Data, or open frameworks)

An M&A or divestiture is a stressful, uncertain time for anyone involved. However, careful planning and stakeholder engagement can result in a transformed enterprise that runs leaner and moves faster. The input of the IT department is invaluable to this goal. The disruption and blurring of sector lines is forcing companies to anticipate challenges to their business models. IT can no longer be considered a support activity, but must stake its claim at the discussion table as a stakeholder in the decision-making process.

Value Global will be attending P2 Energy’s ASCEND Conference in San Antonio, Texas. Managing Principal Shree Sannabhadti will be giving a presentation entitled “Anchor with Land”, detailing our proven methodology for IT to aid Oil and Gas entities during mergers, acquisitions and divestitures. We hope to see you there! 




Optimized Process, Optimized Business

Optimized processes make for an optimized business. Makes sense, right? But Optimization’s powerful foe is the Status Quo, which relies on inertia and a “well, we’ve always done it this way” attitude to halt change and stifle innovation. In today’s competitive and fast-moving environment, however, businesses can’t afford to hesitate or lag behind in adapting to the marketplace.

Business Process Management (BPM) is a discipline that involves the design, analysis, monitoring, and control of these processes. With BPM, processes become more transparent and therefore easier to monitor, which in turn helps to identify and correct any bottlenecks or problems. BPM also minimizes the time and effort required to make necessary process modifications. In addition, optimized business processes can help ensure efficient and effective utilization of resources, resulting in improved performance.

Implementing a Business Process Management Suite

The easiest way to improve process performance is to leverage best-in-class technology and automate as much as you can. Typically, CRM and ERP applications have workflows to manage some processes or portions of a process. However, these often operate within the bounds of the application and are confined to business functions such as sales, manufacturing, finance, etc. They do not span multiple applications or departmental boundaries, and can therefore disrupt the business as a whole through fragmented data or disparate operations. Traditional methods to bridge these gaps often result in a clumsy mix of manual steps, isolated tools, or applications that can be cumbersome to manage and control. However, a new breed of software has evolved to correct precisely these problems and allow companies to integrate their processes more quickly and efficiently, known (most uncreatively) as a Business Process Management Suite.

Essentially, a BPM Suite is a toolbox that helps businesses to:

  • Design: provides graphical modeling tools to visually lay out the steps and flow of a business process.
  • Execute: provides a process orchestration engine to execute modeled processes. Both short- and long-running processes are supported, ranging from a few minutes to days or more.
  • Monitor: captures and screens process data points. This enables oversight of process health, metrics, audit trails, etc.
  • Analyze: measures the efficiency of the process, areas for improvement, and any potential bottlenecks. It is also possible to run simulations on a business process using mock data, which allows you to test and analyze changes to or variations of a business process without actually implementing them.
  • Integrate: provides connectivity to databases using standard mechanisms such as JDBC and to other applications through integration technologies such as SOAP or REST. Out-of-the-box integration with popular commercial off-the-shelf software may also be available.
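To make the Design, Execute, and Monitor ideas above concrete, here is a toy sketch in Python (the process name, step functions, and order payload are all invented; a real BPM Suite supplies graphical modeling and a full orchestration engine rather than a ten-line loop):

```python
def run_process(name, steps, payload, audit_trail):
    """Execute modeled steps in order, capturing an audit trail as it goes."""
    for step in steps:
        payload = step(payload)
        audit_trail.append((name, step.__name__))  # the "Monitor" piece
    return payload

# Two modeled steps of a hypothetical order-to-cash process (the "Design" piece).
def validate(order):
    return {**order, "valid": order["qty"] > 0}

def price(order):
    return {**order, "total": order["qty"] * order["unit_price"]}

audit = []
result = run_process("order-to-cash", [validate, price],
                     {"qty": 3, "unit_price": 10.0}, audit)
print(result["total"])  # 30.0
print(audit)            # [('order-to-cash', 'validate'), ('order-to-cash', 'price')]
```

Because every step execution lands in the audit trail, the same data that drives the process also feeds the Monitor and Analyze capabilities.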

The primary benefit of BPM is that it facilitates collaboration between business and IT teams. The two can work together to design, implement, and improve business processes, ensuring better alignment between what the business needs and what IT delivers. Additionally, for organizations that have implemented Service Oriented Architecture (SOA), BPM is the icing on the cake: BPM can leverage services to quickly build out business processes and ensure that the benefits of SOA are fully realized. And for organizations thinking about implementing SOA, BPM is a great enabler, as business users are more likely to understand and approve the business case for BPM than for SOA.

Choosing the BPM Suite That’s Right for You

There are a number of vendors with BPM products, including large enterprise software companies such as IBM, Oracle, Tibco, and Software AG. There are also pure-play companies like Pegasystems, Appian, and Aurea that have a BPM Suite as their flagship product. Each of these products has its strengths and weaknesses, so, as with any other software, it’s extremely important to choose the one that best fits your needs. Among other factors, it’s crucial to:

  • Identify a few key processes that you would like to automate. Some products are better suited for certain types of processes than others.
  • Identify other systems that may be involved in these processes.
  • Analyze integration capabilities. Even though these products often provide drag-and-drop tools to build processes, you will need customizations and extensions in the long run.
  • Select the proper technology stack. For instance, if your company largely runs on Java, you will likely be better off with a product that allows customizations using Java, rather than a product that requires using .Net.

We recommend engaging a seasoned, trusted vendor to go through a system discovery and analysis process to get all of the information necessary to choose a BPM product. Once the product has been selected, start with a small, well-defined process; ideally, it should be one that can alleviate a pain point and clearly demonstrate measurable ROI. You’ll be amazed at what achieving a few quick wins with BPM can do for your business, your resources, and your bottom line.


Azure vs. AWS: Comparing the Best of the Best

If you’re looking to migrate your ERP system to the Cloud, you’re in luck – you’ve got a ton of options. But between Google, Salesforce, AWS, Oracle, and Azure, this same amazing opportunity for choice can cause decision paralysis. With so many trusted industry players getting in on the Cloud game with fantastic products, it’s hard to know which option is truly the best for your enterprise.

While we’re not going to break down every Cloud option for you (seriously, the list is way too long!), we are going to walk you through a comparison of two of the biggest, most prominent options: Microsoft Azure and Amazon Web Services.

We’ve previously broken down the basics and features of AWS on our blog, so before we get into the comparison, let’s take a minute to go through Microsoft’s cloud star: Azure.

About Microsoft Azure

Since its release in 2010 as “Windows Azure”, Microsoft’s comprehensive Cloud offering has consistently been a leader in the IaaS space. Azure can easily integrate with almost any existing IT environment and supports several programming languages. Its 50 service offerings ensure a plethora of options and customization for businesses.

Like AWS, Azure runs on a network of data centers spanning 22 regions, each responsible for a specific location. Also like AWS, Azure offers extensive predictive analytics services, such as machine learning, Cortana analytics, and stream analytics. These allow data from the Cloud to be organized and synthesized into actionable intelligence for businesses.

Azure is an extremely popular Cloud choice for large, established businesses, with more than 66 percent of Fortune 500 companies relying on its services. The Azure cloud integrates seamlessly with a huge variety of infrastructure, operating systems, programming languages, frameworks, tools, databases and devices. While Azure is not a completely open-source platform, you do have the ability to: run Linux containers with Docker integration; build apps with JavaScript, Python, .NET, PHP, Java and Node.js; and build back-ends for iOS, Android and Windows devices. As a result of a deal with Oracle, you can also deploy Oracle-based software on an Azure platform.

As part of its built-in support, Azure offers enterprise-grade SLAs on services and 24/7 health monitoring and tech support. To ensure the highest level of security for its users, Azure has built an extensive network of secure private connections, along with data residency and encryption features.

By now, you’re probably thinking, “That sounds great! But also Amazon sounds great! This is extremely unhelpful. What was the point again?”

Trust us, we hear you. Both of these options are best-in-class products from companies that have defined and dominated the tech world for 20 years. It’s not an easy choice, but ultimately, you really can’t go wrong. When it comes down to it, AWS’s advantages are:

  • Ease of use, intuitive management dashboard and APIs
  • Massive scale and cutting-edge features
  • Resource availability

Azure, on the other hand, offers:

  • Better, bigger options for multi-cloud backup
  • Superior hybrid-cloud offering
  • Seamless integration with Microsoft and Oracle products

If you’re preparing your company for a Cloud implementation and migration, the most important thing you can do is your homework. Analyze your KPIs and determine what’s most important to you. Understand what your company requires from its cloud infrastructure now and in the future so you get a better idea of your scalability needs. Know what your current systems look like and figure out how your Cloud needs to work with them.

Most importantly, select a vendor who can help your team throughout the entire process – from selection to implementation to adoption – to ensure minimal disruption to your day-to-day operations and maintain the highest integrity of your data. 


The Everything Cloud

In some senses, Amazon has come a long way since its inception as an alternative to the brick-and-mortar bookstore in 1994. Even Jeff Bezos, who declared that he wanted to turn Amazon into the biggest store in the world, did not predict the degree to which it would surpass his vision. The company has not only grown exponentially, but completely transformed the way people shop…for anything. So perhaps it makes sense that its biggest, most profitable service ever is another startling disruptor. And no, we’re not talking about 2-Day shipping (although we probably won’t argue with you either). We’re talking about Amazon Web Services (AWS), which provides on-demand computing resources and services in the cloud for anyone, with pay-as-you-go pricing. AWS offers a broad set of services that help you move faster, lower your costs, and scale your applications. The combination of a wide customer base and agile, abundant features makes AWS one of the most popular cloud providers in the industry. In fact, Gartner placed it at the top of its Cloud Infrastructure Magic Quadrant in 2015, with customers deploying an estimated 10 times more infrastructure on AWS than the next fourteen providers combined.

At its most basic, AWS makes it easier to build and manage your websites and applications, allowing you to:

  • Host a static website, which uses technologies such as HTML, CSS and JavaScript to display content that doesn’t change frequently
  • Host a dynamic website, or web app
  • Process and store data
  • Handle peak loads (for high-traffic websites)

Security in the cloud is always one of the top concerns for new users or companies. AWS provides a secure global infrastructure with 12 geographical regions for servers and multiple data centers per region; it also has a specific “GovCloud” for US Government customers. In addition to the containment benefits provided by these regions and centers, AWS offers a range of security features and options:

  • Access to AWS data centers and network is strictly controlled, monitored, and audited
  • Security credentials are tightly managed and monitored
  • ACL-type permissions can be applied to your data, and data at rest can be encrypted
  • Virtual private clouds (VPC) can be set up to be isolated from other virtual networks
  • Operating systems can be controlled and configured to your specifications
  • Security groups, which act as a virtual firewall for all traffic, can be set up
  • Login information can be encrypted

Because AWS was intended for use by both individual consumers and companies with a need for more sophisticated infrastructure, its benefits are diverse and include:

  • Designed to allow application providers, ISVs, and vendors to quickly and securely host their applications
  • Reliable, secure, and global infrastructure
  • Scalable and high-performance applications can be adjusted based on your needs
  • Built-in flexibility enables you to select the operating system, programming language, web application platform, database and other services
  • Tiered pricing: pay only for the compute power, storage, and other resources you use, with no long-term contracts or up-front commitments – and the entry tier is FREE!
  • Huge variety of products and professional services

On top of all this, AWS committed in 2014 to 100% renewable energy usage, partnering with Iberdrola Renewables, LLC, EDP Renewables and Tesla Motors to achieve this goal. So whether you’re a company looking to become more agile or an individual searching for a place to host your personal blog, take a look at what AWS has to offer. If you’re still not sure about making the leap, Value Global has a full range of cloud services to help guide you towards the right solution.  And, after all, you rely on Amazon for everything else; why not the Cloud, too?


Building Your Legacy in the Cloud

Because so many people are still unsure of what exactly “the Cloud” is, we tend to think of it as a buzzword, a brand new addition to the world of IT. In reality, cloud computing has been around for the last 45 years, with the first documented usage in the 1970s. In fact, you may not have realized it, but the first time you personally accessed the Cloud was through that Hotmail or Yahoo account you signed up for in 1996!

Of course, today’s age of cloud computing is much more sophisticated and has a significantly wider influence across all industry sectors. Especially in the last few years, the impact of the cloud has spread by leaps and bounds, from the IT systems of cutting-edge companies to our almost mundane interactions on our personal phones.

The current cloud landscapes can be broadly classified as:

  • IaaS – Infrastructure as a Service, which provides users with resources over the Internet;
  • PaaS – Platform as a Service, which provides a development environment for applications, and;
  • SaaS – Software as a Service, which allows users to gain access to applications and servers.

The cloud can be further broken down into categories such as: Private Cloud, Public Cloud and Hybrid Cloud.

This litany of options can make it difficult for corporations to determine which solution best fits their goals and budgets. In determining a cloud strategy, it’s important to:

Know Your Might

  • As with any other implementation, it is crucial to focus on long-term ROI, not on the initial cost. In determining which cloud strategy is going to most enhance your organizational capabilities, consider the overall benefits and cost-savings that will empower your business most down the road.

Know Your Landscape

  • Spend some time analyzing and breaking down your current IT landscape from all angles. Whether you’re a company that has a large central infrastructure or one that is broken into several smaller environments, make sure you know exactly how all of your environments intersect and interact.

Know Your Benefits

  • Consider all of the pros and cons for each cloud option, including: flexibility, cost effectiveness, data security, data integration, data mining and analytics options, and business intelligence capabilities.

Know Your Strategy

  • Once you’ve determined your needs for IaaS, PaaS and/or SaaS, as well as the right mix of Hybrid, Private or Public Clouds, make sure you’ve laid out a complete roadmap for transformation, including milestones, and stick to it. Consider engaging a trusted partner to manage the migration or perform some of the heavy technical work.

ERP to Cloud Solutions

These guidelines address most of the general issues and questions that arise in implementing the Cloud, but there are specific challenges for ERP-to-Cloud migrations, a situation that Mark Hurd of Oracle called “a not-so-classic case of several irresistible forces meeting a movable object”. Established companies are often entrenched in their legacy systems, and it can be particularly challenging for them to choose a cloud solution that preserves the integrity of their existing applications while enhancing and bringing them up to date with the new infrastructure. For ERP-to-Cloud migrations, be sure to consider:

  • Flexibility vs Stability: Traditional ERP systems have often been modified to suit specific business needs and processes. A cloud-based ERP is much more stable, but much more rigid. Knowing your organizational needs and future initiatives is crucial in prioritizing these infrastructure qualities.
  • Business Process Reengineering (BPR): Rigorous BPR prior to your migration will ensure that any application, process or modification no longer serving your organization is eliminated, allowing you to start fresh with a more productive and less redundant IT environment.
  • Integration Needs: Migrating disparate systems into the Cloud can be tough, but investing in integration work during your migration can alleviate many of the issues that might arise and ensure a smooth, streamlined transition.
  • Data Security: Data security used to be a huge concern when migrating to the Cloud, but that concern is rapidly declining as public cloud ERP matures. Private cloud solutions might be the best option for smaller corporations with more urgent data security needs, while a Hybrid Cloud can allow large companies to choose what information they store in their cloud environment.

For one cloud engagement, we relied on heavy integration work with Oracle ERP, ADP and Banks for a Workday implementation. During this process, we also invested significant time for our team to conduct BPR. As a result, we were able to identify the additional development and maintenance necessary for a fully implemented solution, for which we utilized Oracle SOA Suite.

While it may seem that these migrations require considerable technical “heavy-lifting”, the long-term benefits are well worth the investment. If you have specific questions or concerns regarding your company’s migration, contact us!

Stay tuned for our next blog post, in which we will take a deep dive into the features and benefits of Amazon Web Services!


Big Data and the Big Dance

The NCAA Tournament is famous for its unpredictability. This year, the University of North Carolina, the University of Virginia, the University of Kansas and the University of Oregon are the top teams vying for tournament glory, but the odds are that at least one of these teams will be taken down by an underdog before reaching the Final Four. Every year, sports pundits and fans alike eagerly wait to see who the next “Cinderella story” will be, who will take up the mantle held by teams like Butler in 2011, George Mason University in 2006 and North Carolina State in 1983. In fact, since seeding began in 1979, only once – in 2008 – have all four top-seeded teams made it to the Final Four.

It is precisely these unexpected twists and turns that make March Madness one of the most popular events in sports, with millions of people filling out brackets to join office pools, enter national tournaments and compete with their friends to see who can predict which of the 68 eligible teams will come out on top. Unfortunately, statisticians estimate the odds of correctly predicting the outcome of each game in the tournament to be one in over nine quintillion.

Yes, you read that correctly…one in over nine quintillion. To date, no perfect bracket has ever been verified, a fact of which Warren Buffett was probably well aware when he offered a billion-dollar prize for one in 2014 (it was a pretty safe bet; no one who entered the competition made it past the 2nd round with their bracket still intact). To try and make sense of the madness, fans and coaches alike are turning to Big Data analysis to build not only the perfect bracket, but the perfect team. In fact, March Madness has been called “America’s most popular exercise in statistical reasoning”.
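The quintillion figure is just arithmetic: a standard 64-team single-elimination bracket has 63 games, and picking every winner by coin flip yields 2^63 possible brackets:

```python
# A 64-team single-elimination bracket plays 63 games (each game eliminates
# one team, and 63 teams must be eliminated to crown a champion).
games = 63
outcomes = 2 ** games  # every game has two possible winners
print(f"{outcomes:,}")  # 9,223,372,036,854,775,808 (over nine quintillion)
```

Informed pickers do better than coin flips, of course, which is why statisticians’ real-world estimates vary, but even the most generous ones leave the odds astronomically long.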

Before the Big Dance even begins, coaches analyze massive amounts of data gathered through technology such as SportVU, a system of arena cameras that records and stores raw information at 25 frames per second. This data, in turn, helps them determine probable outcomes for matchups or predict how a certain player can be expected to perform under pressure. The analysis has become such a powerful tool that former Butler University coach Brad Stevens (now head coach of the Boston Celtics) employed a full-time statistician on his staff to help him decide his starting line-up and create optimal player combinations.

It doesn’t seem like much of a stretch to apply the extensive analysis gleaned from the court to predicting tournament outcomes. The Dance Card, a formula developed by Jay Coleman, Mike DuMond and Allen Lynch, has a 97% success rate in predicting which teams will receive at-large bids into the Tournament, correctly choosing 141 of 146 teams over the last four years. However, success rates for predictive analysis based on Big Data drop sharply after the initial bids. Even with the extensive availability of data, the right formula for predicting the NCAA tournament remains elusive; Nate Silver, the famously successful statistician who made his name in baseball analytics and correctly forecast the winner of every state in the 2012 presidential election, achieved only a 33% accuracy rate with his 2014 tournament predictions.

In pursuit of the perfect algorithm, Kaggle, a leading platform for predictive analysis and data modeling, challenges data scientists with March Machine Learning Mania. The 600+ competitors not only build and test their predictive models against past tournaments, but are also required to provide a quantitative measure of confidence for their predictions, ensuring a scientific approach and mitigating the “lucky guess”. The 2014 winners combined a formula that analyzed teams’ performance per possession with Las Vegas betting odds, which incorporate intangible factors such as injuries and home-court advantage.
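Competitions like this typically score probabilistic picks with log loss, a metric that rewards honest calibration and heavily penalizes confident misses. A minimal sketch, with invented sample games:

```python
import math

def log_loss(predictions):
    """Average log loss over (predicted_win_probability, outcome) pairs,
    where outcome is 1 if the pick won and 0 if it lost. Confident
    misses are penalized far more heavily than hedged ones."""
    return -sum(y * math.log(p) + (1 - y) * math.log(1 - p)
                for p, y in predictions) / len(predictions)

# Two models each call two of three (invented) games correctly,
# but the overconfident one pays dearly for its single miss:
cautious = [(0.6, 1), (0.6, 1), (0.6, 0)]
overconfident = [(0.95, 1), (0.95, 1), (0.95, 0)]
print(round(log_loss(cautious), 3))       # 0.646
print(round(log_loss(overconfident), 3))  # 1.033
```

Under this kind of scoring, hedging toward 50% on genuine toss-ups is often the rational move, which is exactly the discipline such competitions are designed to enforce.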

In the end, most of the estimated 70 million brackets will be created through some combination of data analysis and pure gut instinct (or perhaps, in some cases, blind hope). If you do decide to rely completely on a predictive data model, Will Cukierski, a competition administrator at Kaggle, suggests combining several techniques at once, a practice known as “stacking”. Merging a formula that focuses on better seeds with one that focuses on defensive prowess, for example, could give you a powerful advantage.
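The simplest version of that idea is a weighted average of two models’ win probabilities for the same game (true stacking would train a meta-model on the base models’ outputs; all probabilities and weights below are illustrative):

```python
def blend(p_seed, p_defense, w_seed=0.5):
    """Weighted average of two models' win probabilities for one game.
    (The inputs here are illustrative values, not fitted estimates.)"""
    return w_seed * p_seed + (1 - w_seed) * p_defense

# A seed-based model favors the higher seed (75%); a defensive-efficiency
# model sees a closer game (55%). An even blend splits the difference:
print(blend(0.75, 0.55))       # ~0.65
# Trusting the seed model more pulls the estimate toward it:
print(blend(0.75, 0.55, 0.8))  # ~0.71
```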

Then again, you might just be better off choosing teams by mascot.


3 Key Elements of Successful Global Teams

Whether nearshore or offshore, through in-house hires or through a capable partner, nearly all IT departments enhance their capabilities by engaging a remote team. Some of the most common reasons underlying the need for a distributed workforce are:


  1. Cost – rates for a blended delivery team are much lower than traditional onshore rates
  2. Extended Coverage – allows the work to continue off-hours (nights, weekends, vacations, holidays) without sacrificing a work-life balance for your onshore team
  3. Risk Mitigation – continuous coverage significantly raises the odds that small issues will be caught and corrected before they become big problems
  4. Scalability – easily add to your team during gaps in coverage or high-volume periods
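The cost point above comes down to blended-rate arithmetic: hours delivered at lower offshore rates pull the average cost per hour well below the onshore rate. A quick sketch with hypothetical numbers:

```python
# All rates and hour splits below are hypothetical, for illustration only:
onshore_rate, offshore_rate = 150, 45   # dollars per hour
onshore_hours, offshore_hours = 30, 70  # split of a 100-hour delivery week

blended_rate = (onshore_hours * onshore_rate +
                offshore_hours * offshore_rate) / (onshore_hours + offshore_hours)
print(blended_rate)  # 76.5 -- roughly half the pure onshore rate
```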


Globalization has brought a world of resources to our fingertips. Innovations, talent and information previously out of reach are suddenly available at the click of a button. It has not only made engineering complicated solutions possible, it has made delivering them easier.

It’s also made managing your team much, much harder.

As a manager, it’s your job to ensure that each member is properly utilized and performing well. In addition, you must be constantly aware of individual, sometimes competing, personalities and goals. This can be difficult when your team members are in the same spot – even sitting right next to each other! But with the literal expansion of the modern work environment, managers must be able to succeed with teams in different offices…and different time zones…and different countries.

Value Global consultants have invested countless hours in figuring out exactly how to ensure success with this “blended delivery model”. As a result, we’ve identified three crucial puzzle pieces that every successful distributed team needs: Compatibility, Communication and Ownership.

Compatibility

  • Select a partner (not a vendor) with relevant industry experience, of a compatible size to your organization, and who comes with trusted references. Consider using one partner for multiple needs, which will minimize the initial learning curve and increase long-term collaboration.
  • Evaluate and select team members in the same way that you would members of your own team – think of this partner as an extension of your organization and you’ll get the right people for your needs.
  • Get to know your new team members! A video conference is a great way to make the introductions and being able to put a face to the e-mail address will make a huge difference.

Communication

  • Take time upfront to train your new team and document all necessary procedures (code check-ins, unit tests, QA testing, communications, troubleshooting, escalation, etc.). Be available for them to ask questions and provide feedback, especially during the first few weeks of delivery.
  • When you determine shift times, make sure the US and remote shifts overlap to accommodate any issues that require a collaborative effort or instant communication.
  • Hold daily “standing” calls to ensure transparency and productivity. Consider creating a digital team portal where members can post comments, questions and thoughts throughout the day.
  • Schedule service reviews at least monthly (more often if schedules allow) and make adjustments based on feedback from both the US team and the remote team.
  • For any application development projects, be sure to hold demos every 1-2 weeks to allow for timely adjustments and feedback.

Ownership

  • Identify a person responsible for remote delivery both onsite and at the remote location. If new responsibilities arise, communicate these needs through the points of contact (POCs) to make sure the team has the bandwidth to deliver.
  • Select a work management system with ticketing – use this to track assignments with estimates and due dates, as well as to document decisions.
  • Maintain realistic expectations and adjust accordingly – a remote team can’t compete with the onsite resources who have an in-depth knowledge of your business, as well as direct access to business users and decision makers. However, with guidance and patience, this kind of business intelligence will increase exponentially over time.
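As a rough illustration of the ticketing guideline above, a work item that tracks assignments, estimates, due dates and documented decisions might capture fields like these (a minimal sketch; the field names are assumptions, not any specific tool’s schema):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Ticket:
    """Minimal work item for a distributed team. Fields are
    illustrative, not the schema of any particular tool."""
    title: str
    assignee: str
    estimate_hours: float
    due: date
    status: str = "open"
    decisions: list = field(default_factory=list)  # documented decisions

ticket = Ticket("Nightly ETL job failing", "remote-team", 4.0, date(2017, 4, 3))
ticket.decisions.append("Escalate to onsite POC if unresolved by shift overlap")
ticket.status = "in progress"
print(ticket.status, len(ticket.decisions))  # in progress 1
```

Whatever the tool, the point is the same: every assignment carries an owner, an estimate and a due date, and every decision is written down where both teams can see it.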


We’ve applied these guidelines to project teams for anything from large implementations to micro-application development; we’ve consistently been able to deliver faster and more efficiently. Questions about our enterprise services and how your organization can benefit from blended teams? Contact us!
