BitDepth#967 - December 16

Regulators consider Columbus buyout
Representatives of telecommunications operators take questions from moderators that originated from the audience at Wednesday’s meeting of regulators. Photograph by Mark Lyndersay.

Over two days last week, Wednesday and Thursday, regional telecommunications regulators met at the Cascadia Hotel to consider the proposed buyout of Columbus Communications by Cable and Wireless (CWC).

The opening session offered operators in the telecoms market in T&T an opportunity to state their positions on current issues and to offer statements on the impact of the consolidation of these two major business interests in the region.

The meeting was held under the Chatham House Rule: attendees were allowed to make use of the information presented, but not to attribute it to a source or speaker.
Since then, Digicel has released the full text of the speech given by its Chairman, Denis O’Brien, to the regulators in attendance.

Despite early calls for “earnest and productive discussions,” at “a watershed event,” a meeting described as “a catalyst for justice,” O’Brien’s speech captures much of the spirit of ruthlessly polite sniping that was prevalent between the major players at this critical showdown.

The meeting was described as “an incredible event”, and it was observed that “never have we had so many people together in one room discussing what we’re discussing.”
At issue, ultimately, is “not how many players are in the market, but how those players compete to the benefit of the region’s customers.”

Clarity was called for in the matter of mergers and acquisitions; the prior fuzziness thrown into sharp relief by a case unprecedented in its scale in the region.
The meeting was seen by most operators as a critical opportunity to streamline the telecommunications regulatory regime in the region, which is today overseen by 20 regional regulators who are tasked with guiding the operations of two multi-national players.

In at least one presentation, the history of cable signal piracy was aired in disturbing detail.
In another, the challenges that Barbados faces as a nascent competitive environment for telecommunications were explained, as that country seems set to collapse back into a monopoly position.

Other statements questioned the value of evaluating the future by the standards of the past.
While cabled solutions are likely to remain dominant as the source of extreme high bandwidth speeds for at least another decade, advances in technology look set to reduce the need for cables to the residence and SME business in favour of newer, faster technologies.

It was also argued that the region’s market for telecommunications solutions, though small, is underpenetrated.
Making regulatory decisions for the situation as it stands at the moment will be challenging enough, but regional regulators face a situation in which the speed of technology advancements will bring into question even the most sane and sensible decisions that are likely to emerge from last week’s discussions.

The back and forth over expensive infrastructure investments also points to other possibilities.
Should the nations of the region seek to begin building national infrastructure to support telecommunications, creating and siting transmission towers and running cabling to support long term competitive investment by telecommunications businesses while optimising and consolidating the necessary intrusions that such technologies impose on our small and attractive landscapes?

There is already a model for that, and it’s one that’s worked well for both the National Gas Company and for Trinidad and Tobago’s natural gas industry.

These and other issues need to be clarified quickly and clearly, particularly the thorniest ones: ensuring a level playing field for the two established multinational telecommunications companies, providing access points for future entrants into the business and addressing the concerns of the many smaller local players operating in smaller niches of the now-standard quadplay business approach.

This is tricky, often region-specific business, particularly in the five countries described as “overlapping markets”: St Lucia, Grenada, Jamaica, Barbados and T&T. But as all the operators made clear on Wednesday morning, these decisions will reverberate throughout their business and on through the next decade of telecommunications development and investment.

BitDepth#966 - December 09

Managing large image datasets
Adobe’s Terry White listens to David Vaskevitch as he discusses asset management as part of a seminar on visual storytelling at PhotoPlus Expo last month. Photo by Mark Lyndersay.

Nobody likes to talk about backup much, I’ve found, and quite recently, I discovered that nobody likes to listen to advice about it either.
And if you think backup is a sour discussion, wait until you start to talk about archive access and watch a very special kind of photographer glaze set in.

I participated in a local panel discussion about backup for photographers a couple of weeks ago and despite a lot of talk about the importance of the subject, all it took was a bit of rain to dampen everyone’s spirits and for the topic to get tossed on the backburner again.

This isn’t unique to T&T, unfortunately.
At last month’s PhotoPlus Expo in New York, the topic arose in several sessions, most notably in a pre-show discussion titled How Technology Is Leading the Storytelling Revolution.

During that session, Adobe evangelist and photographer Terry White noted that “We need to manage the process of online archiving and access.”
“I’ve lost galleries posted to photo sharing sites that have expired or simply aren't in existence anymore.”
“I think photographers are having a bad time adjusting to all the things that are happening,” said Time’s Paul Moakley.

Time is investing in technologies that allow it to more efficiently mine user-generated content and to more easily contact users on the spot at breaking news events.
But it was David Vaskevitch, former CTO at Microsoft, who best summed up the situation for today’s photographers.

“Digital photos may last longer as bits because they are copied to many different devices,” he explained, “but there are issues of confidence in trusting a company to outlast your photos.”
“If you want a service that respects your need for long-term archiving, I don't think that exists now. There is also the challenge of tunnelling into a mass of photos to find specific stories.”

Vaskevitch has a vested interest in the problem. He’s involved with Mylio, which had a big presence at this year’s expo and is the first major new entrant into the field of archive access in more than a decade.
During that time, archive access had determinedly moved upmarket into the enterprise. Extensis Portfolio, one of the earliest entrants in the market for digital asset management, killed its single-user product in favour of business-class offerings.

Canto’s Cumulus has always been focused on the vertical market, and the sole product remaining in the archive access marketplace is Media Pro One, which, after passing through several owners, appears to be languishing at Phase One.

Most photographers try to repurpose browsers like Bridge or parametric image editors like Lightroom to serve as asset managers, but they aren’t designed for that purpose and only work well with smaller datasets.
Move up to the terabyte class with hundreds of thousands of files and their capacity to scale begins to break down, all without offering any of the key features of a true digital asset manager (DAM).

A DAM tool creates its own catalog of low resolution thumbnails of any digital asset it is built to manage, while allowing user browsing, sorting, organisation and metadata input (keywording, copyright information) without access to the master files.
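The catalog-plus-proxy pattern described above can be sketched in a few lines of Python. This is a hypothetical illustration of the general DAM model, not the internals of any product mentioned here; the `AssetRecord` and `Catalog` names are invented for the example:

```python
# A minimal, hypothetical sketch of a DAM catalog: it keeps lightweight
# proxy records (thumbnail, keywords, copyright) so that browsing and
# metadata editing never touch the master files.
from dataclasses import dataclass, field


@dataclass
class AssetRecord:
    master_path: str            # where the full-resolution original lives
    thumbnail: bytes            # low-resolution proxy stored in the catalog
    keywords: set = field(default_factory=set)
    copyright: str = ""


class Catalog:
    def __init__(self):
        self.records = {}

    def ingest(self, path, thumbnail):
        """Register an asset; only the proxy is stored in the catalog."""
        self.records[path] = AssetRecord(path, thumbnail)

    def tag(self, path, *keywords):
        """Metadata edits touch only the catalog, never the master file."""
        self.records[path].keywords.update(keywords)

    def search(self, keyword):
        """Find masters by keyword without opening a single image file."""
        return [r.master_path for r in self.records.values()
                if keyword in r.keywords]


catalog = Catalog()
catalog.ingest("/masters/2014/IMG_0001.CR2", b"...jpeg thumbnail bytes...")
catalog.tag("/masters/2014/IMG_0001.CR2", "carnival", "portrait")
print(catalog.search("carnival"))  # ['/masters/2014/IMG_0001.CR2']
```

The point of the design is that the catalog scales with the number of records, not the size of the masters, which is what lets a true DAM handle terabyte-class archives that overwhelm tools like Bridge or Lightroom.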

Mylio manages this by creating a local database as well as creating an online backup of photographic assets, which are then available for browsing and photographer management and adjustment across a range of devices.
For Mylio to succeed and for Media Pro One to get a new lease on life, there’s going to need to be a big sea change in the way photographers think about their growing image assets, but that has been slow in coming.

The idea of managing collections of work for relicensing is one that takes time to sink in and, for most of today’s photographers, it simply hasn’t been a big issue on their radar, which seems almost permanently set on flashy lighting techniques, stylish post processing and hot retouching procedures.

That’s unfortunate, because when the need finally hits, there may be no products left on the shelves to meet it.

BitDepth#965 - December 02

Creating our own narratives
Photojournalist Rick Smolan explains the evolution of his new project, Inside Tracks, at PhotoPlus Expo 2014. Photo by Mark Lyndersay.

Whenever I attend a PhotoPlus Expo, I nose around a bit trying to find out what the show’s organisers are pursuing as a theme for the show.
It tends to emerge most clearly through the speakers and their choice of subjects rather than as any clearly articulated statement.

At past expos, video on DSLRs, search engine optimisation and shrinking markets have emerged as headline subjects.
Many of this year’s speakers seemed resigned to the reality that the editorial and documentary markets are permanently changed by the new distribution models emerging on the Internet, and conversations were focused on how to make the most of the new opportunities.

At a symposium titled “Cross Media Projects: A New Path for Visual Storytellers,” four early movers in the space specifically discussed their experiences in the evolving business of developing image-based stories.
The veteran of the space, the still disturbingly youthful Rick Smolan, discussed the rejuvenation of his 1992 book project, From Alice to Ocean, itself an amplification of an earlier National Geographic story about Robyn Davidson, a young woman who decided to go walkabout in the Australian outback with four camels and a dog.

The new book, Inside Tracks, is now being revamped and embedded with technology in support of a film version of Davidson’s book, Tracks. Image recognition triggers video playback on a smartphone or tablet, linking a reader to related clips from the film.

In many ways, From Alice to Ocean was the smallest of Smolan’s book projects, a precursor to his bookshelf breaking series of Day in the Life books which brought together a star roster of photographers to document seven countries and the state of California, each over the course of 24 hours.

Other speakers included Jessica Dimmock, a young photojournalist, and writer and filmmaker Julie Winokur, the wife of photojournalist Ed Kashi, who began by expanding his work into a series of documentaries, then began working on her own projects.

Douglas Menuez spent years documenting Steve Jobs and the NeXT project, fascinated by the story of a young technologist, ousted from the company he founded to start again.
Menuez was on the verge of taking his story public with Time for a cover story when Jobs, for reasons that remain unclear to the photojournalist, killed the story.

The photographer recalls meeting Steve Jobs in a corridor at NeXT after learning the news and being reassured by him that “the photos would make a great story one day.”
It might well have been one of Jobs’ smaller predictions, but it turned out to be true. Menuez continued his documentation of technology companies, finding doors opening easily for someone who had earned the trust of the mercurial Jobs.

The collection is lodged at Stanford University, which called the photographer to oversee the scanning of the work. As he looked at the images, an idea began to germinate.
The result, an edit of 250,000 images later, is Fearless Genius, a gritty black and white document of the seminal years of Silicon Valley between 1980 and 1990.

Menuez has spun the project, which leverages a selection of 7,000 scanned photos, into a lavish book, a web app, a documentary, a television and web series as well as a schools education programme.
All the documentarians were upfront about the challenges of financing their projects.

Julie Winokur remembers saying of an early documentary short, “What do you mean we’re going to give it away?”
But such apparent largesse can prove critical in building attention and support for a project and others to follow.
All on the panel agreed that it was important not to mix personal finances with project finances, but most laughed ruefully when Menuez noted, “but we all do it.”

Menuez has finances as well as his image collection invested in the development of the Fearless Genius project, which had an early boost of $2 million from a private investor to create the digital files needed to move it forward.
All acknowledged the need to pin projects on something current.

Fearless Genius has a component that examines today’s technological innovators. Smolan’s Inside Tracks uses cutting edge technology to link to the online videos. Many of Winokur’s projects look at headline issues like ageing and sharp political divides.

None of these creators are waiting for publishers or broadcasters to develop their projects, acknowledging that for passion projects or subjects lying just outside the mainstream of public interest, the only way to build an audience is to accept the risks and build the projects. And just maybe, they will come.

Making a great Kickstarter campaign
You are selling a story, not a product
Make a killer video (short and sweet)
Offer killer rewards
Sweeten the rewards with digital goodies
Enlist your friends
Identify “Amplifiers”
Make your updates interesting and worthy of forwarding
With Kickstarter you can build a relationship with your customers

Planning Funding
Budget the entire project, detailing costs for each element.
Project revenues over the project’s life
Research and target sources of funding
Prepare pitch materials

BitDepth#964 - November 24

On Microsoft’s M4 Roadshow
Mariana Castro, General Manager, New Markets for Microsoft Latin America gives the keynote speech at last week’s Microsoft M4 Conference. Photo by Mark Lyndersay.

On Thursday, Microsoft T&T hosted a major local conference at the Hyatt Regency, with a strong slate of speakers ready to explain the company’s offerings to a profile of visitors that ranged from IT professionals to Public Sector infrastructure planners.

But it wasn’t immediately clear what the company hoped to accomplish with the event, which trotted out an impressive phalanx of positive statistics and put several of its Latin American managers centre stage to reiterate the new cloud-first, device-agnostic positioning of its software and services unveiled by new CEO Satya Nadella earlier this year.

Mariana Castro, General Manager of New Markets for Latin America (LATAM), got the ball rolling with an impressive reiteration of its refreshed market approach and offerings in the region.
Microsoft has had a presence in LATAM for 20 years now and has been delivering cloud based solutions for just as long. The company has invested US$2.3 billion in its cloud infrastructure, creating geo-redundant data centers at locations in North America, Europe and Asia.

Castro explained that the company has migrated from its previous model of licensing its software to one in which products can be turned off and on according to client need, rather obscurely described as “digital work and life experiences.”

The company has been leveraging the scalability of its cloud services to offer decisive advantage to companies, most notably small and medium enterprises (SMEs) and governments who have large spikes in their resource use.
That allows IT administrators to buy additional support resources temporarily to serve peaks in demand.

The Government of Haiti is running its systems in Microsoft’s cloud in the wake of the infrastructural collapse there; the Sochi Olympics scaled their capacity to meet demand using the service; and the Mexican tax administration office uses Microsoft’s cloud to manage more than 16 million invoices per week, with peak surges that run as high as 147 million.

Barcelona hosts 1.5 million guests for its La Merce Festival and uses Microsoft’s cloud services to manage the surge in foot, bike, auto and public transportation demand that ensues.
But these impressive top-level wins never seemed to be followed through with any meaningful detail. It all seemed like a roadshow that was only marginally adapted for local consumption, and that’s surprising for a company that’s so deeply embedded in the T&T market.

Now that Microsoft has moved virtually all its productivity and enterprise support software to the cloud and made clear its intention to pursue subscription models that can be extended to most computing devices, it’s surprising that the company would still be selling features over implementation models to the extent that it was last week.

Yammer, a Facebook-like social collaboration tool created by Microsoft was lauded for its success after being deployed in 85 per cent of Fortune 500 companies, but I don’t remember seeing a screenshot of what it looks like, far less a demo of how it works.

Microsoft has a Digital Crimes Unit, but offered no details on how customers or governments have made use of its capabilities, preferring instead to tout its customer-centric policies on data access and protection.

The company offers its server-side products across a range of four tiers of hybridisation: On Premises, which affords the customer complete physical ownership of the IT infrastructure; Infrastructure as a Service, for which Microsoft offers cloud-based hardware; Platform as a Service, in which the customer runs the software and stores the data; and Software as a Service, which turns everything over to Microsoft.
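Read as a responsibility split, the four tiers hand layers of the stack from customer to provider one at a time. The mapping below is an editorial summary of the descriptions above, sketched in Python; it is not an official Microsoft responsibility matrix:

```python
# Who runs which layer at each hybridisation tier, as described above.
# This mapping is an editorial summary for illustration, not an official
# Microsoft document.
tiers = {
    "On Premises": {"hardware": "customer", "platform": "customer",
                    "software": "customer", "data": "customer"},
    "IaaS":        {"hardware": "provider", "platform": "customer",
                    "software": "customer", "data": "customer"},
    "PaaS":        {"hardware": "provider", "platform": "provider",
                    "software": "customer", "data": "customer"},
    "SaaS":        {"hardware": "provider", "platform": "provider",
                    "software": "provider", "data": "provider"},
}


def customer_responsibilities(tier):
    """List the layers the customer still manages at a given tier."""
    return [layer for layer, owner in tiers[tier].items()
            if owner == "customer"]


print(customer_responsibilities("PaaS"))  # ['software', 'data']
print(customer_responsibilities("SaaS"))  # []
```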

The sales pitch, which was pervasive, is compelling. Microsoft’s cloud based management oversees, among other things, anti-spam, anti-virus, version control and rights management, all issues that consume significant IT department time and effort.

The prospectus for the offering was robust enough that Digicel T&T’s John Delves appeared to announce that the telecommunications provider would be a reseller of the Office 365 suite as part of its converged solutions.
It’s possible that I missed a deeper dive into the case studies that were dangled before the audience like tasty morsels, but the modus operandi of the event seemed to be to tempt and woo rather than offer details on projects that might be seen as more directly relevant to local issues.

Microsoft will probably argue that it targets specific markets with case-studies and examples and leverages its significant partner network in T&T to offer customised solutions, but a conference should inspire thinking as well as sales.

M4, unfortunately, seemed to miss that aspirational boat.

BitDepth#963 - November 18

A sensor story
Professor Eric Fossum, inventor of the CMOS sensor chip, speaks at the UWI Open Lecture last week at the Faculty of Engineering. Photograph by Mark Lyndersay.

Professor Eric R Fossum is probably the most important man in the world of modern photography, but he almost walked right past me without my knowing it.

Lost on my way to the lecture he was scheduled to give last Monday, I spotted an old school friend, who also happened to be the Chair of the Open Lectures Committee that brought Professor Fossum to T&T.
In an uncharacteristic moment of mental agility, I deduced that the person ambling along beside him had to be the feature speaker.

As the professor gripped my hand firmly after introductions, he quietly said, “Thanks for coming to the lecture.”
“Huh,” I might have said if I were forty years younger, “as if.”
Professor Fossum is the scientist who leveraged the groundbreaking charge-coupled device (CCD) technology that underpins almost all modern visual telecommunications to create a more energy efficient and ultimately more scalable revision called complementary metal-oxide-semiconductor or CMOS technology.

Moore’s Law, the axiom that has rather efficiently predicted the rate of technological progress (a doubling of transistor counts roughly every two years), has been kind to these light-gathering technologies. In 1971, it was possible to put 2,300 transistors on a chip; now chips pack in up to 2.6 billion of them.
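Those two figures sit neatly on the doubling curve; a quick back-of-the-envelope check (assuming a doubling period of two years, one common statement of the law, and the 1971 starting count of 2,300):

```python
# Sanity check of the Moore's Law figures quoted above, under the
# assumption of a doubling every two years from 2,300 transistors in 1971.
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count under a simple Moore's Law model."""
    return base_count * 2 ** ((year - base_year) / doubling_years)


# Forty years is twenty doublings: 2,300 * 2**20 = 2,411,724,800.
print(f"{transistors(2011):,.0f}")  # 2,411,724,800
```

That lands within a few per cent of the 2.6 billion figure cited, which is what makes the axiom such a useful planning tool.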

The learned professor quickly powered through the technical part of his presentation, a dense thing made of high level physics and some folksy characterisations of his invention.
“Light is a wave,” he noted as a slide came up, “but light is sometimes also a particle.”
This was the sort of thing that always drove me a little loopy in science class and at this level, with discussions that reference “photon shot noise” and “the Poisson Process,” things were drifting firmly in the direction of hard science.

One sentence peeped through, however: a mention that scientists seem to agree that we are approaching the limits of physics when it comes to pixel sizes.
Pixels, or picture elements, are the tiny light sensitive sites packed onto a camera’s sensor that actually capture the light representing your photo and translate it into the electrical impulses that become the bits of an image.

Professor Fossum began his work on the technology that would become the CMOS sensor at NASA’s Jet Propulsion Laboratory, where he was charged with making digital capture devices that were smaller, lighter and more energy efficient to fit into the smaller spacecraft that the space agency was building.
Galileo launched in 1989 with an 800 x 800 pixel sensor, but in 1992, the Professor’s team developed intra-pixel charge transfer, a method of reducing noise and amplifying signal.

This began the era of a camera system on a chip, where the chip itself carried the technologies needed to decode the light it was capturing.
His work having arrived at a satisfactory stage for NASA’s needs, JPL took the next step with its discovery and began negotiating, through CalTech, technology development agreements with the giants of the day: AT&T, Kodak, Schick Technologies and Bell Labs.

The technology transfer proved slow and required constant intervention by the team at JPL, so NASA took the uncommon step in 1995 of allowing the frustrated team to license their invention and market it as Photobit.
Photobit ran as a self-funded concern from 1995 to 2001, fueled by design contracts with private industry, and filed 100 new patents on imaging technologies.

It was acquired by Micron Technologies in 2001 and the intellectual assets reverted to CalTech, but by then, 30 companies were working on CMOS based technologies.
“A patent never stops anyone from stealing your idea,” Professor Fossum noted wryly, “but you may be able to make them pay.”

Today, Sony and Samsung are the world leaders in CMOS chip production, ranked at one and two respectively, and two billion digital cameras are produced every year, roughly 60 cameras per second.
CalTech has successfully enforced its patents against multiple players, and the story has come full circle: NASA is now using the technology in space.
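The per-second figure follows directly from the annual one; a quick check of the arithmetic:

```python
# Check of the production figure quoted above: two billion cameras a year
# works out to roughly 60 per second.
SECONDS_PER_YEAR = 365 * 24 * 60 * 60       # about 31.5 million seconds
cameras_per_second = 2_000_000_000 / SECONDS_PER_YEAR
print(round(cameras_per_second))  # 63
```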

As Professor Fossum gets ready for his third retirement, he is fascinated by the impact of imaging technology, which he has seen bring new social issues, visual overload, rapid social change, instant communication and some inappropriate use since its introduction.

He’s also working on a new technology, the Quanta Image Sensor (QIS), which promises to deliver digital images with a more film-like exposure, through capture sensors that respond to light more like traditional emulsions did, putting a billion sensors on a chip while pulling a watt of power.

But he’s also looking at the directions that computational imaging is taking, with technologies like light field technology (exemplified by the Lytro camera) inviting emphasis now.

“Understanding business isn’t rocket science,” he argues, “it’s a lot easier than engineering.”
“When there’s a need that people want satisfied, when you can answer who is going to buy this, there is your opportunity.”

Looking back at his work with the CMOS sensor, he boiled his learnings down into five brief sentences.
  • Create the invention.
  • Successfully commercialise it.
  • Deliver on the promise compellingly.
  • Sell the company.
  • Defend the patents.

BitDepth#962 - November 11

CWC rising
The handshake that rocked the Caribbean. Chairman of Cable and Wireless Communications (CWC), Sir Richard Lapthorne (centre) and Phil Bentley, CEO of CWC (left) congratulate Chairman and CEO of Columbus International Inc. Brendan Paddick following the announcement of the proposed merger between CWC and Columbus in London. Photo by Mark Shenley, courtesy Columbus.

On Thursday, Cable and Wireless Communications (CWC) announced that it would be buying Columbus International, the quadplay, broadband backhaul upstart that matched, on the ground, the challenge that Digicel mounted to incumbent TSTT in the mobile telecommunications space.

Despite the keenness of Phil Bentley, Chief Executive Officer of Cable and Wireless Communications, and Brendan Paddick, CEO and Chairman of Columbus Communications, to characterise the deal as a big win for the customers of both companies, the driver of this deal is, ultimately, profit.

For CWC, it’s an opportunity to substantially change its faltering position as a telephony services provider in regional markets that are largely abandoning its landline products and offer a resurgent presence as a quadplay giant.

For Columbus, the prospects are far simpler. The buyout puts US$707 million on the table for its founders and fills a giant US$1.17 billion hole in the company’s cash registers, instantly monetizing its massive cable infrastructure buildouts over the last ten years.

It is not a done deal. The CWC statement on the acquisition notes, under “Risk Factors,” that the buyout is “conditional upon, among other things, approval from the US antitrust authorities under the Hart-Scott-Rodino Act, and relevant authorities in Barbados, Jamaica and Trinidad and Tobago.”

Regulatory approvals are also required from the governments of Barbados, Jamaica and T&T.
That these issues are lumped in with more practical matters related to the integration of technologies suggests that CWC considers these to be details, a notion robustly supported by the matter-of-fact declaration of the acquisition.

While the acquisition makes clear business sense for both parties involved, there remains the matter of everyone else involved in the regional technology dance.
Digicel, which has been on a bit of an acquisitions binge itself over the last two years, immediately issued a release registering concern, noting that the “proposed transaction raises a considerable number of issues for telecommunications regulation and competition generally in the region.”

Central to that is the collapse of the three player market, which saw Digicel, CWC (a minority shareholder in TSTT locally) and Columbus jostling for customers in markets in aggressive though quite healthy flux.
Of the three, the 143-year-old CWC, once the communications hub of the British Empire, was the most measured and careful force, allowing Digicel to grab vast swaths of mobile marketshare in the Caribbean and diminishing CWC’s Lime brand to 16 per cent of the Jamaican market in 2013.

In T&T, the business proposition is even more delicate for CWC and TSTT. The government owns a 51 per cent controlling shareholding in the local telecommunications company through NEL, but the relationship between CWC and TSTT has been combative for years.
CWC’s Phil Bentley described TSTT as a “failing enterprise,” in a May Business Guardian interview.
CWC would later be announced as one of the applicants for a third mobile telecommunications license for T&T.

“At the end of the day,” Bentley told the BG, “this is a business. We invest money and hope to get a return. If we can’t get a return, we will go somewhere else. We have got lots of options.”
Those options got exercised last week, and TSTT is now in the curious position of having on its board a potential competitor with detailed access to its five-year recovery plan, not to mention everything it needs to know about TSTT’s processes and infrastructure.

TSTT’s Board of Directors and management, led by a chairman with no discernible telecommunications experience, must now face a business challenge of staggering proportions.
In the face of that, even the fate of the employees who built Columbus into a force to be reckoned with – and now look set to disappear into the considerable bureaucratic maw of the CWC machine – begins to pale.

Will the government buy CWC’s shares, or bank on the Telecommunications Authority (TATT) demanding the conversion of those shares into a non-voting investment as a condition of its approvals?
TSTT is unlikely, in its current state of incremental recovery, to find a major buyer other than the Government capable of snapping up CWC’s shareholding and it’s doubtful that the public would be willing to finance an IPO on a scale that would cover it.

If CWC is granted a mobile telephony license, would it even need a business partnership of any sort with TSTT?
In a business landscape dominated by Digicel and CWC, a two-player market would become the norm in most of the region.

Locally, what would TSTT’s role conceivably be? Is it destined to become another Caribbean Airlines: a quasi-State agency funded interminably on romantic notions instead of hard business principles?
Consolidation is good for business. It leverages costly installed assets over a larger customer base, reduces employee head count and dramatically improves procurement clout.

But it also stifles competition, creating monoliths that crush startups and reduce customer choice.
TATT has already made it clear that it wants three-way competition in the local telecommunications sector. Now it must make sure that the playing field remains level and fair in this new dispensation.

BitDepth#961 - November 04

A day without the news
Delphine Halgand, Santiago Lyon and Antonio Bolfo speak at PhotoPlus Expo last week. Photo by Mark Lyndersay.

Increased danger in field reporting has grown troubling enough that it has not only captured the attention of the United Nations, it led to the strengthening and re-emphasis in December 2013 of the 2006 Security Council Resolution 1738, following recommendations from Reporters Without Borders.

That hasn’t slowed the appalling statistics offered by the organisation’s US Director, Delphine Halgand, at the opening keynote of PhotoPlus Expo, A Day Without News, in New York last week.
There are 177 journalists in jail at last count, along with another 178 bloggers and net-based citizen journalists.

Of the 71 journalists killed in 2013, 39 per cent of them died in a war zone. There have been 56 journalist killings in 2014.
More than 80 per cent of such crimes go unsolved and 94 per cent of the killings targeted locals covering news in their homeland.

Such indifference to these crimes is so widespread that last Sunday, November 02, was marked as the second Day to End Impunity, an effort to make nations aware that allowing, and even encouraging, situations in which the killers of journalists can escape trial or punishment is unacceptable to the global community.

Antonio Bolfo, a Reportage by Getty photojournalist emerged as the star of the discussion, an impassioned, disturbingly young and surprisingly experienced veteran of conflict zones both in America and abroad.

Bolfo is one of hundreds of young photojournalism graduates and freelancers who have chosen to pursue the news, but he recently drew the line at returning to Syria, where journalists are being specifically targeted by ISIS militants.

“You find yourself in an environment in which fixers, their family and friends are rewarded for turning over journalists,” he explained.
“Even if you can trust your fixer (a local person engaged to provide or source services or contacts), can you trust everyone they know?”

“Why are we taking the risks we take today?” he pondered aloud.
“Young journalists wanting to break into the field are willing to take more risks, but there’s a need for security protocols and safety procedures.”
“Journalists must place more emphasis on planning and preparation. You simply can't go into a conflict zone without proper training, insurance and a satellite phone.”

“In some conflict zones, journalists are worth ransom money,” said Ron Haviv of the news photo agency VII.
“When it's about money and not ideology, things are very different.”
“And it's not just about money,” Hagland added.
“There is also the widespread publicity that the ISIS killings brought to that organisation.”

“Twenty years ago in the Balkans, the reporting was remote,” explained Santiago Lyon, AP’s Director of Photography.
“The immediacy of news has changed the dynamic. Our subjects are now aware of the impact of the news and the immediate value of engaging with journalists – to ransom and to execute them.”
“AP has a reluctance to take unnecessary risks,” Lyon explained.
“There are fewer agencies now, and those that remain must evaluate their responsibility to freelancers.”

To bridge that gap in coverage, AP has been working with user-generated content from activists, but now finds that ISIS is using the same channels to tell their version of the story.
“We do evaluate all user-generated content according to the circumstances of its posting,” Lyon said.
“We want to be right, to be accurate when we publish, but we are also managing pressures from customers who want news right now.”

Moderator David Friend, editor for creative development at Vanity Fair, noted that such pressures do not only happen in foreign countries.
“There has been a chilling effect arising from intimidation techniques both in the US and abroad,” Friend said.
“Even in democratic societies there are moves to limit access to information, and threats to whistle-blowers and those who report on their revelations continue.”

BitDepth#960 - October 28

On the count of four
Samsung’s Gear S marries the smartwatch features of the original model with the fitness focus of the Gear Fit with a larger screen and more phoneless features. Photo courtesy Samsung.

A week ago, Samsung introduced their four flagship smart devices for the local market to a small group of influencers and journalists at the Hyatt Regency in T&T.

Over the last few years, the company’s touring events have fluctuated in size, but this year’s seemed unusually cozy, with just a dozen people in the hotel’s boardroom space gathered to hear the pitch for the four products that will be officially launched to the public on November 01 at the Samsung Experience Store at Gulf City Mall.

Of the four, only one is pushing aggressively into a new market space. Samsung’s new Gear S is a beefier version of their wearable computer that seeks to liberate the smartwatch from being a satellite to a smartphone. 

The Gear S, running Tizen, has specifications that would have been acceptable on the smartphones of five years ago, with 512MB of memory, 4GB of solid state storage, a SIM slot and its own 3G and WiFi radios. When paired with a smartphone, users can finally channel their inner Dick Tracy and make and receive calls directly from their wrist.

Samsung uses the 360 x 480 pixel screen to show off dazzling graphics that mimic the look of high-end phones, but this is a screen, curved and sharp, that looks usable for doing something beyond checking the time, health stats and e-mails.

The Note 4 is an evolutionary upgrade to the company’s market-defining line of larger smartphones, and it’s one that’s surprisingly small, tidy and neat for its physical specifications. At least part of the reason for that is Samsung’s thinner bezel, which pushes the screen much closer to the edges of the device, creating the illusion of a smaller box.

Samsung has invested significant design work in making the phone feel lighter, slimmer and smaller generally while preserving the large screen that’s always been the device’s big draw.

The new Galaxy Tab S is an evolutionary upgrade to the company’s underrated line of tablets. The 10.5 inch screen is sharp and crisp, packing in 2,560 x 1,600 pixels, and the processor is snappy and responsive.

The company has mercifully dropped the faux leather of the previous version in favour of a sleek finish and profile that’s likely to get dropped into an appropriate protective case at the first opportunity by any serious user.

The Galaxy Alpha is the curiosity of the bunch. A slightly smaller device than the flagship Galaxy S5, the Alpha is an almost perfect first smartphone, but it’s also a no-compromise device with specifications that place it close to the S5 Mini, with a price to match.

Final local pricing is likely to decide the positioning of this device in the T&T market, but Samsung could do more to improve its pricing and feature differentiation among its midrange devices.

The smartphones and tablet all run Android 4.4.4 with a refreshed TouchWiz (Nature UX3.0) interface that’s fashionably flatter and more colourful than previous versions.

Beyond those cosmetic touches, improved specifications and the shaving down of the devices’ physical size comes the decision to replace the long-standing “menu” button that has lived next to the home button on Samsung’s smartphones and tablets for, well, forever, with a “window” button.

This one small change is likely to cause all kinds of muscle memory spasms for long-time Samsung device users who are accustomed to accessing additional app features through that button.

By putting multi-window access just to the left of front and centre on these new devices, Samsung is sending a not-so-subtle message that more convenient multi-tasking will be a core of the future feature set for TouchWiz, which is evolving from a desktop metaphor to a more aggressively touch-driven interface that makes greater use of icons for visual cues.

BitDepth#959 - October 21

Digicel goes hardline
DigicelTT CEO John Delves speaks with the media after the company’s launch of Fibre to Business on Thursday. Photo by Mark Lyndersay.

The spotlight for the evening at La Cantina last week was supposed to fall on Digicel’s new offering, Fibre to Business, a cable based enhancement of its thriving Digicel Business initiatives.

But CEO John Delves couldn’t just leave it at that. Flush with the excitement and buzz in the crowded room and the enormous support present for another provider of cable based broadband, he let the other shoe drop. 

All that’s standing between Digicel and the cable entertainment business is an approval from the Telecommunications Authority. The company is ready and apparently champing at the bit, as it were, to become a quadplay provider, offering mobile and landline telephony, broadband access and cable entertainment.

A lot of that’s likely to be riding on the successful deployment of the company’s local infrastructure, a huge and costly undertaking that’s underway in Port of Spain and will be deployed in major business centres in the East and South of T&T in the coming months.

“Digicel Business,” noted CEO John Delves, “is the fastest growing sector of the company.”
To support that aggressive growth, Digicel plans to build out the majority of its fibre optic cable infrastructure for business within 12 months, with service level agreements for uptime for customers of its cloud based solutions and upscale services for larger companies like Metro Ethernet.

“People think fibre is only for big business,” Delves said, “but this solution targets small and medium-sized businesses who want a competitive advantage.”
To that end, La Cantina wasn’t just a launch location for the new service; it was also a satisfied customer of the new Fibre to Business product. Owner Kester Sylvester, who had pre-recorded his testimonial, eagerly took the microphone to offer his own impromptu live endorsement of Digicel’s commitment to offering site-specific solutions for his establishment.

With 100Mbps connectivity and customised installation on the menu, Sylvester was swooning at the difference in his restaurant’s WiFi offerings.
But this infrastructure deployment is only the first pass that Digicel will be making at its customer base.

The company has made big investments in regional broadband infrastructure, buying 3,100km of submarine fibre optic cable from Global Caribbean Fibre in December 2013 and from Guadeloupe’s Loret Group and Caribbean Fibre Holdings in early September, adding data backbone connectivity to 12 countries in the region from T&T to Puerto Rico, with onward connectivity to the US.

Digicel has to build its infrastructure in T&T, Barbados and Haiti from scratch, but it’s been briskly buying existing fibre optic networks in other countries over the last 11 months.
In July, the company bought Telstar Cable’s cable and fibre network in Jamaica. It acquired WIV Cable TV and its broadband subsidiary, TCT in the Turks and Caicos Islands in April. SAT Telecommunications in Dominica was added to Digicel’s portfolio in February and Caribbean Cable Communications Holdings of Anguilla, Nevis and Montserrat was bought in November 2013.

Digicel’s hopes to go quadplay and to branch into cable telephony and entertainment might be making Flow and TSTT nervous, but the company hasn’t stopped there, and media houses might want to take note of what comes next.

Digicel is now dipping its toe in content creation, buying a majority stake in St Lucia based regional cable sports agency SportsMax in September from International Media Content Limited.

In October, the company trumpeted the success of its regional news services, LoopTT (there is a Jamaican and Barbadian edition) and CatchOn Sport, both of which emphasise pithy news reporting for a younger audience reading on mobile devices via a mobile optimised website and dedicated apps for multiple handsets (currently available for Android only).

After a month of operations in T&T, Trend Media, a wholly owned subsidiary of Digicel launched in August, claimed in a press release from Digicel to have “revolutionised the Caribbean media landscape,” and will next “pioneer innovation in online and mobile advertising solutions.”

While Digicel has business operations in emerging markets around the globe, it is spending big in the Caribbean region and intends to consolidate the presence it began in 2001 with its first office in Jamaica.
“The Caribbean is the [business] core,” John Delves noted. “It makes sense to invest here.”

BitDepth#958 - October 14

The photographer’s selfie
A bit of photographic meta. A self-portrait of a selfie in progress. Photo by Mark Lyndersay.

A few weeks ago, I posted my first selfie on Facebook and it occasioned a small bit of fuss among the circle of friends who commented on it.
Evidently, I was believed to be widely opposed to this widespread phenomenon of self-documentation, or at least, it seemed, I was expected to be.

Part of that is likely to be because I don’t tend to take many photographs of myself. As I often tell skittish portrait subjects by way of empathy, I’ve spent an awful lot of time and energy getting on the other side of the camera.
It’s an aspect of a series of posts about portraiture that I’ve been working on for my photo blog, and one that I've never fully considered before now.

And yet, I’ve got quite a few photographs of myself on file. Among them: a black and white multiple exposure I took while working out that technique in the early 1980s, the “Mohawk for Carnival” photo and, of late, far more frequent annual updates to meet the voracious needs of modern social networks, which quickly tire of the same thing posted for too long.

I’ve got at least one Facebook friend who changes her profile photo with the reliability and speed of Big Ben, pretty much rotating in something new or rarely seen on what seems like an hourly cycle.
It’s a thing that I’ve wrestled with for a long time, particularly since at least part of my business is based on the need to update and improve corporate images among people for whom the perception of self is as important as the reality.

That’s why I find it so surprising that creative people are so often terrible at putting a meaningful image up as their avatar, the digital representation of self so pivotal to online interaction that there’s a service, Gravatar, dedicated to it.

Among my creative colleagues on Facebook and Twitter, there are still a few outline heads and eggs, the default icon that pops up when you haven’t uploaded a photo representing yourself to those services, along with a smattering of bizarre images that have nothing to do with the people I know.

When it comes to the Internet generally and social media in particular, I’ve found that most impressions are based on a quick read of “about” pages and anything personal.
Yet it is in these places, now public on a scale without precedent, that we seem most comfortable with being seen as quirky, strange or complicated.

There are some people who want to be seen as exactly that and others who will argue that I manage to be all three despite posting a photo revealing same, so there’s that.
A selfie can be a remarkably informative thing. I know more about the sons and daughters of friends from the photographs they post of themselves than I do from anything I get told by their parental units.

There is, however, a gulf of difference between a selfie, which tends to capture an engaging personal moment for further appreciation, and a self-portrait, which tends to be a more deliberate and considered effort at defining oneself visually.

Both are intended for public consumption, but they tend to serve quite different needs, though the inevitable mash-ups that digital technology encourages have tended to blur the line between the two.
Of late, I’ve been considering the difference between the two as I began to rethink the value of call cards.

Do people look for your call card or search for you online first? I’ve begun to think that Google is the new directory for finding people and that an online call card might be a good idea.
While preparing to post on the popular people finder service About Me, I also decided to press into service a domain I’d been holding for the last four years to keep it out of the hands of name poachers, as a prototype online call card.

Increasingly, we are all going to be evaluated by the images we circulate online and will be measured and understood less by what we write than by the pictures we post. The pictures we post of ourselves will be the adjectives of those declarations.

BitDepth#957 - October 07

Windows 10, anyone?
Windows 10 includes grace notes from both versions 7 and 8 to create a product that’s a bit of the best of both.

On the very last day of September, Microsoft introduced the newest version of its flagship operating system, Windows 10.
What’s that, you might be asking? What happened to Windows 9? Don’t ask; the Internet has already answered that, for the most part with snide humour. Hit a search engine to find out for yourself.

And this isn’t exactly Windows 10, at least not yet. Microsoft makes it clear that this version is subject to change, and the product is still very much under development.
If you decide to play around with it, be aware that it’s a Technical Preview, very much a beta product and one that Microsoft is rather boldly circulating widely in the hopes of gathering feedback about how the product works for its customers.

Since builds like this are normally released only to developers, it’s clear that the company acknowledges the missteps of Windows 8, a brave and largely unwelcomed effort to merge a tablet OS with a desktop interface.
The company has posted a blog with an embedded video that shows off the new design features and approaches.

In the blog, Terry Myerson, Microsoft’s front man for the launch, neatly conflates the blurbs about Windows 8 into the vision of new CEO Satya Nadella: “This new Windows must be built from the ground-up for a mobile-first, cloud-first world.”

The new Windows offers some nuanced touches that are clearly designed to meet the expectations of users who have become comfortable with Windows 8 as well as those who have staunchly resisted upgrading and remain on versions 7 and earlier.
That’s no small figure.

As of July 2014, Windows 8 adoption was 12.5 per cent of Windows installations, compared with over 50 per cent for Windows 7.
By September, that number had inched up to 13.37 per cent for both Windows 8 and its healing upgrade, 8.1, but that put the OS just below the widely reviled Windows Vista (14.3 per cent) and well below the now formally abandoned Windows XP, which tenaciously holds 23.89 per cent of the Windows install base.

The first thing that the intrepid user will find is the return of a full start menu, not the lame halfway effort included with the 8.1 upgrade.
Not only does it look like the old start menu, it’s actually better, adding selected tile apps to the right of your application launch list, giving users a quick update on what’s been happening in their digital lives.

It’s an inspired use of the Modern UI tile-based interface so controversially introduced with Windows 8, and you’ll find those apps appearing in more sensible ways as you work through the new OS.
Which isn’t to say that it’s perfect. Some tiles that you’d expect to be live sit silently in the start menu and others update with such speed that the menu becomes a bit dizzying to use. You can also resize it upward, but apparently not sideways.

The Task View, a merging of Exposé and Spaces on OS X, is an effort at organising app windows, virtual desktops and icons for hard-core users, but it’s a bit inconsistent in its execution.
Clicking on the Task View icon or pressing Alt-Tab gets you the same familiar view of your workspaces, but Windows-Tab delivers something quite different for no apparent reason.

There’s also needless conflict between the Start Screen and the Start Menu. Microsoft feels that you should choose one or the other. I have no idea why.
Tile based apps are no longer bound to the digital ghetto of the Start Screen and now launch on the desktop, with their own windows in the desktop interface. This is likely to be a great boon to the developers who committed to creating software for the Windows App Store.

The Windows 10 Technical Preview isn’t for everyone; Microsoft specifically offers the software for “PC experts and IT Pros” through its Windows Insider Programme, though users frustrated with Windows 8 may be tempted to go for the new features early.
My own installation, which downloaded briskly and installed even faster in an old VMware Fusion virtual machine, takes up 10GB on disk, not bad for a modern OS installation.

But this is a version of Windows for adventurers, columnists and the technically adept. Using it on a production machine is probably inadvisable, no matter what your friends tell you.

That said, Windows 10 shows far greater promise of becoming that chimeric hybrid that the company was seeking when it released Windows 8 and most compellingly, it’s the product of a company that’s been listening to its customer base. That’s never a bad thing.

BitDepth#956 - September 30

Passport to where?
BlackBerry Executive Chairman and CEO John Chen holds the company’s new Passport phone aloft at the launch event on Wednesday last week. Photo courtesy BlackBerry.

In the week of Bendgate, with flocks of pigeons descending with diarrheal enthusiasm on Apple’s launch of the iPhone 6 Plus, it was hard to imagine that another technology company might rival the misfortunes of the Cupertino company and its allegedly overly flexible new phone.

But one only needed to look north to Toronto to find the once market-leading BlackBerry offering up a new smartphone to skies thick and dark with corbeaux, their bladders full.

If there’s anything worse than bad press, it’s got to be no press, or so little of it that it hardly matters. Even worse than that is the prospect of marginal market share for not just a new product, but one that represents BlackBerry’s last, best hope for a presence in the pockets of business users.

At a cozy event on September 24 in Waterloo, Ontario and in the company of famed hockey player Wayne Gretzky, BlackBerry CEO John Chen introduced the new Passport device, a squat looking little smartphone with a design influence that seems based on a…passport.

If you’re looking for style, seek out the BB Porsche Design P’9983, a smartphone with far more impressive design pedigree, but BlackBerry is positioning the Passport as the device that its dedicated users will want to migrate to.

It’s pretty ham-handed positioning, though; the P’9983 will be available through Harrods at the Porsche Design in-store shop for £1,400.
The company is playing a careful and deliberate marketing game here, announcing on its home turf and claiming Canadian carrier Telus as its first official partner with accompanying discounts for the first weeks of sale.

AT&T has been announced as its official US carrier, with 30 selected territories to come soon (T&T isn’t one), though the phone will apparently be widely available unlocked at a US price of $599.

That’s already proven to be a key factor in the phone’s appeal to its audience. Within hours of its release, BlackBerry was blogging about the 200,000 orders for the device on Amazon that had skyrocketed the Passport to the top spot among smartphones on the company’s US store.

For the average user, the Passport hits some modern notes as well as some old favourites. The BlackBerry keyboard is back, for one thing, along with a hefty 3,450mAh battery that’s rated for 30 hours of average use.

The new keyboard, which is generating mixed feelings among the BB faithful, also functions as a trackpad with a swipe across its surface.
As with other recent BlackBerry models, this device runs some Android software, which users can access through the preloaded Amazon Appstore app.

The phone’s specifications are distinctly average: a Qualcomm Snapdragon 801 2.2GHz quad-core processor, 3GB of RAM, 32GB of storage and a 13MP camera.
Earning a place in the pockets and clasps of today’s executives will be a challenge for the hefty phone, which measures 5.03 inches tall, 3.55 inches wide and 0.37 inches thick, and tips the scales at 6.91 ounces.

BlackBerry is putting it to their users that they really need a square tank to be efficient, and they may be the only smartphone maker with a business base dedicated enough to survive the asking.
With a square screen, this isn’t a device for looking at movies or YouTube videos, it’s dedicated to creating an old-school workspace with a distinctly modern pixel density of 453 pixels per inch on a 4.5 inch screen. Words, spreadsheets and presentations will be crisp and sharp.

From the perspective of BlackBerry’s recent smartphone history, the Passport is actually a strong entry in the market, but the market has also generally moved on to faster, smaller, sleeker devices with far richer software ecosystems.
I’ve said it before, and I’ll say it again.

Good hardware is the price of entry into this competition for smartphone customers, software is where they are won over and BlackBerry hasn’t done a thing worth noting on that front.

BitDepth#955 - September 23

Battle of the phablets
Apple’s new iPhone 6 Plus (right) is going to spark some interesting changes in the smartphone market. Photo courtesy Apple.

For the last six years or so, I’ve been an Android guy as far as smartphones go.
I’ve used Samsung’s devices specifically, from their shaky start as an iPhone challenger with the S1 through to the capable and competitive S4.

There’s a lot to like in Android. If you use Google, your phone becomes an effortless extension of everything that Google has to offer, from search to web apps.
Google’s open door to developers of all kinds, though, results in an app store for Android that’s riddled with iffy software.

The steady growth in the availability of a much wider range of useful software for Android has helped, from the monolithic presence of Microsoft Office to tiny but important projects like Sunrise, which pulls your iCloud appointments into Google’s universe.
But let’s be honest here.

The Android app store can’t compare, particularly for creatives, to the thriving abundance of apps available for Apple’s iPhones.
At least part of the problem is Android fragmentation. OpenSignal reports that in August 2014, there were 18,000 distinct Android devices in the market awaiting the attention of a potential developer.

Most of them can’t be upgraded to the current version of Android, so developers either target the most recent devices, leaving older phones and systems out in the cold or delay Android development in favour of software for iOS.
This is where the relative blandness of Apple’s product line offers its strongest market value, both for users and software developers.

Apple updates its iPhones on a cycle that runs between eight and ten months and maintains backward compatibility for roughly four generations.
That means that while iOS 8 really delivers for the new iPhone 6 models, it can be expected to run well on the 5s, 5c, 5 and 4s devices.

That 4s compatibility is a bit of a checklist item though. The Internet is currently a hotbed of complaints that the 4s becomes a turtle slogging through hot asphalt after the upgrade.
My own reaction suggests that the new releases, both software and hardware, will be just the wake-up call that Samsung needs right now.

For one thing, I’m looking very, very seriously at the iPhone 6 Plus. It’s a honking big phone, but I’m a honking big person, and if I invest in it, it won’t just be to use a phone; it will be to condense my need for a small tablet and a mobile communications device into one, admittedly large, box.
All the bells and whistles of Apple’s design and software ecosystem matter less to me than being able to work more comfortably on a device I can put in my pocket.

This is likely to be a consideration for other folks as well, who are discovering that 4G connectivity, regardless of your carrier of choice, has quickly become a powerful enabler of responsiveness.
Apple’s new phablet is a direct shot across the bow of Samsung’s Note series, a genre-defining line of hefty phones that, to my surprise, I’ve been finding in the hands of quite a few female executives who traditionally carry their phones in their handbags.

This is also where Apple needs to be ready to step into the future if the new device takes off. People who were waiting to buy an iPhone will buy one, but the Plus is a different market category that’s going to demand specific attention if it’s going to succeed.

If it does, Apple is one of the few companies with the capacity to scale to demand. The company reputedly can churn out 45 million iPhones per quarter; if the Plus takes off, Samsung, which also happens to be one of those companies with production capacity, may find itself in an all-out market war.
This is a far more likely possibility than even a skirmish in the smartwatch category, which remains unproven and experimental at best.

Samsung would do well to realise that this isn’t a battle that’s going to be won on specifications. The upcoming Note 4 will feature a slightly larger screen at 5.7 inches to Apple’s 5.5, higher screen pixel density, double the megapixels for the main camera and a processor that’s supposed to be faster.

Samsung’s Note series offers a bigger, more comfortable device for serious smartphone users, but the iPhone Plus is something else entirely for Apple, a product that’s likely to cannibalise iPad Mini sales and become more of a handheld computer than a phone. That’s likely to change the market perceptibly.

To meet that challenge, Samsung would be well advised to drop the crapware from their upscale devices.
With the time to see how the iPhone Plus is being used, they can make deals to pack the new Note with apps that will make it a real competitor in usability.
That’s going to be the terrain for the upcoming battle of the phablets.

BitDepth#954 - September 16

Social intervention, new media style
The popular Facebook pages of the issue-specific social intervention websites Bad Parkers of T&T and TriniHomewreckers.

In an era in which everyone not only has an opinion, but multiple avenues for expressing those thoughts, it’s surprising that there are few efforts at making sustained social interventions using readily available Internet tools.

Most issues earn a rash of comments, usually on Facebook, the occasional Twitter flare-up and the odd petition.
At least two people have managed to be so irritated at a perceived injustice that they have created a continuing, Internet-based effort to keep discussion alive on matters that concern them.

Roger Jackson, a Hospitality Management consultant for the last 35 years, created and runs the popular Facebook group Bad Parkers of T&T, which has attracted 3,444 members in just three months.
“I have been observing the way people park at malls, multi storey car parks, banks, groceries, churches, offices and basically anywhere that people park,” he explained.

“As OCD as I am, I thrive on perfection and seeing cars parked outside of lines, in handicap parking spaces when they should not be or simply parking badly causing inconvenience to others really started to bother me and I thought I would share this with others who I know feel the same way as I do.”

There was a personal spark for this project, and it was one that stuck in Jackson’s craw.
“I was trying to park at a mall a few months ago with my mum, who is 93 and needs a wheelchair to move around, when I [found I] could not use the handicap parking because it was already occupied.”

“I drove around the parking lot and finally found a spot where I could park my car with enough room to open the front door to manoeuvre my mother into her wheelchair, when along comes another car and pulls in the spot next to me and basically refused to move even though I explained about the wheelchair and my mother.”
“To me that was an inconsiderate bad park. There were other spaces available.”

From the start, content flowed to the page, the first surge of members quickly getting the idea of capturing acts of automotive negligence with their cameraphones.
Now members post “every hour on the hour,” and Jackson’s main role is managing a thriving though occasionally rowdy group.

“I have not had anyone challenge me on an offensive post, but I have had to referee between a few members when their posts got out of hand and quite personal.”
For such a large and impassioned group, it’s a bit surprising to discover that Jackson has only had to eject two members but, as he notes, “they both really did not belong in the group.”

The anonymous owner of TriniHomewreckers, a website dedicated to outing women accused of breaking up relationships, with a sideline in deadbeat dads, was cagey about contributing to this column.
Its supporting Facebook page has won 15,222 likes and even spawned a dissenting page on Facebook, Shut Down Trinihomewreckers Website Now, which has attracted a rather less daunting 185 likes since it was created in March.

The website, which logs thousands of visitors per day, was started in November 2013 as “an outlet to allow persons to vent.”
“Let it all out by sharing your story,” the TriniHomewreckers proprietor wrote in response to e-mailed questions.
“I know that there are families that are hurting because of infidelity and betrayal. I have seen families torn apart and children hurting because of this. I also think that others’ experiences will inspire, and to some extent deter, persons from making the same mistakes. I hope it does.”

As cautionary tales go, the posts are both raw and disturbing in their bluntness. It isn’t an easy read if you aren’t already angry about being horned or abandoned and most of the posts veer between slut-shaming and outright libel.

That hasn’t stopped businesses from advertising dating and event planning services on the popular website.
The website got off to a slow start, the owner explained, but then one of the stories went viral and “people became less afraid and began sending in their contributions.”
“The response so far has been overwhelming.”

Dissenters to the concept note that it’s possible for stories to be one-sided or just plain maliciously wrong, but the site’s owner notes that her experience is quite different.
“I read the comments on the website daily, and I have seen readers go back and forth with each other on the topic of family life, infidelity, adultery, betrayal, love and honesty.”

“I can honestly say that the wider community will benefit from frequenting my website as these topics are addressed and discussed in depth. There isn’t even any need for me to give advice, as my readers seem quite capable of doing so.”

There’s a longer interview with the site’s owner, who writes under the unlikely name of Krystal Bleu, here.
TriniHomewreckers has a forum, but it’s barely used; the comments on posts are far more active. Both site owners make extensive use of Facebook’s community tools and potential audience, though Jackson hasn’t bothered to create an independent Internet presence.

“I prefer Facebook as it is a very popular internet site with a massive market all in one package. It is a household and business name. The world is Facebook!”

BitDepth#953 - September 09

Another laptop story
Is the future of education in our children’s laptops? Photo by BigStock.

Last week, the Ministry of Education scampered to respond to a claim by parent Julien Dedier that his daughter’s Government-issued laptop had been infected with spyware.
Education Minister, Dr Tim Gopeesingh then issued a statement that noted the “alleged discovery of this spyware,” as well as the “non-reporting of this incident to the Ministry of Education.”

The issue prompted Dr Gopeesingh to issue a spiral of counterclaims, citing “strong administrative policies governing the laptops.”
“We have firewalling, we have anti-theft and anti-virus devices,” the Minister said, suggesting that any spyware would have been the result of unauthorised installation of software on the State’s property.

Information Security expert Shiva Bissessar of Pinaka Technologies had questions of his own on the matter.
He wondered what Dedier had found and what tools he used in the discovery process.
“If there was a report from a reputable firm describing a methodology for scanning several freshly delivered laptops direct from the manufacturer or the ministry, which then revealed malware, this would be a source of concern,” he said.

Bissessar also warned against taking the Ministry’s assurances at face value.
To discover more about the real-world experience of school laptop use, I turned to a tech-savvy user with a child who has used one for the last few years.
That user, who asked for anonymity and identity obfuscation because of the prestige school her child attends, answered questions about her experience with the system.

The computer, a Lenovo e425, was one of the 75,000 laptops issued by the Government to students entering secondary school over the last four years at a cost so far of more than a quarter of a billion dollars.
“It’s a basic Wintel laptop, with a low end processor, 2GB of RAM and a 300+GB hard disk. Nothing that you would buy for yourself, but adequate,” she explained.

The school’s librarian runs an information technology literacy programme for incoming form one students, and the school has one part-time technician who oversees 600 Government-issued computers, a number that will jump to 750 in October.
My source has had to do maintenance on the computer herself over the years; her child’s machine is one of the few from her cohort that still works.

“If there is a problem, I assume you can take it to him but there is nothing preventive.”
“I do maintenance from time to time so the machine is in good shape. However, I had to (ahem) subvert certain security controls, in order to do it.”
“Most parents could not do this, and the other laptops I have seen, those that can still boot, are in a mess.”

The child uses the system regularly, mostly for research and typing, but also for viewing and drawing manga, a Japanese comics artform.
One major failing of the laptop programme appears to be a lack of continuity in its anti-virus software provision. Most AV software is sold on a subscription model and expires after a year.
“Because I had taken control of the system I was able to replace the AV after the one-year subscription expired,” my source explained.

“I actually called the Ministry and spoke with the programme manager. He told me to speak with the school technician, but he said that he did not have the software. I got the impression that extending the AV for the life of the system had not been thought through.”

At least one major bit of surgery was undertaken on this system during its lifespan.
“The hard disk died after two years so I called the agent, and they said that they would charge $250 to look at the system and that charge would be applied to any repairs.”
“However, I did not like the idea of paying to fix a machine that does not belong to me. I had a spare drive and a Windows 7 license hanging about, so, with the permission of the principal, I fixed the machine.”

This loan arrangement sets up an issue for parents overseeing these laptops.
“The GoRTT [contract] explicitly states that maintenance is the parents’ responsibility after the first year. What I think happens is that many parents prefer to spend the money on a new laptop of their own, and the eCAL laptop is left to rot once it develops problems.”

Over the years, my source has seen no sign that teachers have been incorporating IT into the classroom.
“A one or two-week programme is not sufficient to change your teaching practice if you have taught in a particular way for many years,” she notes.

She has seen a Master’s thesis on the eCAL programme that found teachers struggling with the project; many have given up, some even before getting started.
The issue raised by Julien Dedier sparks larger questions about the laptop distribution programme, questions whose answers appear to run counter to Government-supplied information.

How exactly are parents supposed to perform maintenance on a system that, as issued, they cannot fully access?
Why claim anti-virus protection when it expires after a year with no simple option to renew?
Has the Government reviewed the technician-to-computer ratios in schools?
How many government supplied laptops are still functioning and in use from that first deployment?
Is the honest opinion of teachers tasked to work with these systems being sought in order to improve the pedagogy?

I have only one source to rely on for my information and while that report isn’t as sensational as the one that made news last week, I suspect it merits even greater attention from the Education Ministry.

There has been a lot of positive interest and hope for the widespread introduction of computers to schools in T&T, but these are questions that have been muttered in school hallways and in tech circles from the start.

It’s time that they were answered.

BitDepth#952 - September 02

Ultra High Def TV and you
Samsung’s U9000 UHD series curved television displays on show at the company’s T&T launch at Queen’s Hall on Thursday evening. Photograph by Mark Lyndersay.

On August 28, Samsung introduced its newest line of curved televisions to Trinidad and Tobago, the U9000 UHD series, available in two sizes, 65 inch and 55 inch.
These are undeniably handsome video display devices, with crisp rendition and brilliant colour.

They aren’t Samsung’s first curved televisions or their first large screen UHDTV devices either.
A year ago at IFA in Berlin, the company revealed their first curved televisions and a line of large screen televisions.
The show stealers at that event were the company’s 85 inch and 105 inch UHDTV screens, which ran 4K content designed to make them look not just good, but commanding.

In addition, Samsung has loaded the displays with a range of software tricks that are designed to make the picture look even more impressive.
Enhancements like Auto Depth Enhancer, which according to the company’s press release “automatically adjusts the contrast for greater depth perception,” and PurColor™ are designed to make the screens vivid and more lifelike.

In reality, the screen images are hyper-real; a bit too saturated, too sharp and too startling to be mistaken for reality, but that only makes them more clearly suited for the latest cinematic masterpiece from Michael Bay.
Samsung might save us all a bit of trouble by creating three simpler settings for these displays, “Action Movie,” “Nature Special” and “Sports,” because those are the genres that will absolutely sing on these screens.

But customers of UHDTV displays are going to run into another problem long before those enhancements come along, and that’s finding content that’s capable of filling these curvy displays with the pixels they hunger for.
Standard definition TV, better known as “the stuff we’ve been watching for years,” tops out at around 704 pixels on the longest side.

Translating that into something you might be more familiar with, that’s roughly a third of a megapixel in camera parlance.
DVDs improve on that negligibly, though the compression applied to the signal is far less aggressive, so picture quality improves.

Blu-Ray at its best bumps that to 1920 pixels, giving us an image equivalent to two megapixels.
That isn’t something to underestimate. Most video is encoded at 30 frames per second, so every second of Blu-Ray video pushes more than 60 megapixels worth of data to a high-definition screen.

Now the film industry, which is already struggling to upgrade the typical user from DVDs to Blu-Ray, is being confronted with a format capable of delivering 8 megapixels per frame, with its successor, 8K, on the horizon, delivering 33 megapixels per frame, within shouting distance of IMAX quality.
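
The resolution arithmetic in the paragraphs above can be sketched in a few lines. The frame sizes below are nominal (real-world encodings compress heavily, so these are raw pixel counts, not bitrates), and 30 frames per second is the rate the column uses:

```python
# Raw pixel throughput for the video formats discussed above.
# Frame sizes are nominal; actual encoded dimensions vary.
FORMATS = {
    "SD (704x480)":    (704, 480),
    "Blu-ray (1080p)": (1920, 1080),
    "UHD 4K":          (3840, 2160),
    "UHD 8K":          (7680, 4320),
}
FPS = 30  # common video frame rate

for name, (width, height) in FORMATS.items():
    mp_per_frame = width * height / 1e6   # megapixels per frame
    mp_per_second = mp_per_frame * FPS    # raw megapixels pushed each second
    print(f"{name}: {mp_per_frame:.1f} MP/frame, {mp_per_second:.0f} MP/s")
```

At these rates, 1080p works out to roughly 62 megapixels of raw picture data per second, while 8K approaches a gigapixel per second, which is why no disc format is likely to keep up.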

These are big numbers, particularly when multiplied by the video standard of 30 frames per second and it’s unlikely that a new disc format will emerge in time to capitalise on the demand for UHDTV displays.
Nor are television manufacturers hesitating to push the new technology. They need a selling point to get owners to upgrade, and 3D television turned out to be a humiliating bust.

Fortunately, or unfortunately, depending on your perspective and/or religious beliefs, there is an online video market sector that has pounced aggressively on the potential of UHDTV.
The purveyors of naughty movies have been adding 4K options to their downloadable files since the end of last year. Expect mainstream video sites like iTunes and Netflix to begin gearing up to follow suit soon.

Until then, UHDTV users will have to depend on the upscaling capabilities built into these sets. Samsung introduced its Quadmatic Picture Engine at IFA in 2013, but upscaling from a traditional television signal or a DVD isn’t going to feed these pixel hungry beasts.

And the curved screen? You need to be within 13 feet of the screen for that to make a difference, as this video explains, so if that doesn’t make sense for your room setup, you’ll probably be happier with a flatscreen.
UHDTV is still a bleeding edge, content starved technology and the value of curved displays is yet to be tested in the market, but there’s no denying that the screens, particularly with high definition content, are just plain awesome.
UHDTV is still a bleeding edge, content starved technology and the value of curved displays is yet to be tested in the market, but there’s no denying that the screens, particularly with high definition content, are just plain awesome.

BitDepth#951 - August 26

Talking Net Neutrality
Part of the audience at the start of last week’s discussion on Net Neutrality. 
Photograph by Mark Lyndersay.

Room 101 at UWI’s Engineering Building was surprisingly full for the discussion on Network Neutrality hosted by the Trinidad and Tobago Computer Society (TTCS), the Internet Society Trinidad and Tobago Chapter (ISOC-TT) and the Institute of Electrical and Electronics Engineers Trinidad and Tobago Section (IEEE-TT).

The room number was also ironically appropriate for an event that set out to explore the importance of net neutrality by first creating a platform for understanding the concept. Disclosure: I was invited to moderate this panel discussion.

At its core, net neutrality posits that an open Internet should rest on the principle that service providers and governments treat all data on the Internet equally, not discriminating or charging differentially by user, content, site, platform, application, type of attached equipment, or mode of communication.

It was this idea that drove the explosion of the modern Internet after 1990 when steady increases in technology and most importantly, bandwidth, allowed the creation of ever more adventurous software and technology platforms.

The problem is that any commons is always going to be overgrazed, regardless of how idyllic it seems.

The conversation last Wednesday constantly hovered around the decision by Digicel to block access to selected Voice over IP (VOIP) software last month.

In announcing the block, Digicel accused the services of “bypass activity,” noting that, “Unlicenced VOIP operators like Viber and Nimbuzz use telecoms networks to deliver their services, but do not pay the requisite money for the privilege.”

The company blocked access to Tango, Viber, Nimbuzz and Fring, but never responded to questions regarding the status of Skype and MagicJack, which were not blocked.

Four days later, in the face of a growing negative reaction and the prospect of formal involvement by the Telecommunications Authority of T&T, DigicelTT reversed its ban on the services, though the ban remains in force in Haiti and Jamaica, the first countries to experience the communications blockade.

That very day, the TTCS
issued a statement on the matter in which it countered that the company’s arguments for instituting the block were technically “unsound.”
Specifically, “VoIP services do not present a significant load on the mobile data network and their current network does not allow them to prioritise data packets by content.”

The discussion emerging from that sequence of events followed the kind of arc you might expect. Digicel was out of sync with modern times. Blocking or attempting to prioritise data streams would destroy an open Internet.

What emerged most clearly was a disconnect between the Internet as a business and the Internet as a commons, both of which are necessary aspects of maintaining the most remarkable technological network ever built.

Freedom and openness encourage innovation and enterprise, the type of thinking that builds businesses like cunning VOIP systems as well as the deeply complicated code that expands and enhances our experiences on the web.

But business decisions drive the expansion of the hardware and infrastructure that’s needed to support an ever expanding and apparently insatiable need for bigger pipe for data streams and faster data packet responses to enable interactions that drift ever more inexorably toward real time.

It’s worth remembering that the single largest jump forward taken by Internet backbone technology was funded by the Dot Com Bubble, a time of wild and unsustainable spending on anything Internet that resulted in a massive buildout of backhaul architecture.

One contributor from the floor at the net neutrality discussion suggested that it might be best to “let the market decide.”

But this is one situation in which the market may not be as informed about the issues as it needs to be to arrive at the right decision.

The issue of net neutrality isn’t as simple as whether Digicel should be excoriated for taking drastic action to preserve revenue from its long distance calls or sneered at for not being more digitally enterprising in its approach to solving the problem.

It certainly isn’t going to be solved if service providers can’t engineer a business model that allows them to attract returns on their investments.

It is, in short, a
tragedy of the commons, with both service providers and customers seeking their own interests in a technology that was designed to facilitate the free transfer of information and which has proven resistant to efforts at monetizing old business models when they are transferred into bits.

Last Wednesday’s civil society discussion on net neutrality produced no answers, but raised a lot of questions that demand clarification. That kind of understanding won’t come if businesses cloud the discussions with PR driven obfuscations and users respond with perspectives inflamed by emotion.

A growing, thriving Internet must be paid for, but that coin is no longer denominated only in cash. Attention, access, reliability and excitement are growing currencies being traded in bitspace and everyone has to become both more familiar and more courageous about leveraging them to advantage.

BitDepth#950 - August 19

The Pagliacci syndrome
Anthony Seyjagat photographed in costume, 1990. Photo by Mark Lyndersay.

Robin Williams is dead. There. It's said.
I did not know the man, and even though he gave of himself in an abundance of riches in every medium he touched, I never confused those pleasures with any sense that I understood who he was.
At least part of that is because of Anthony Seyjagat, a talented physical actor, mime (of all things), and an awesomely effervescent personality.

Anthony did traditional mime performances, but then he took it to another level in a seriously disturbing suite of pieces performed with Penelope Spencer at Raymond Choo Kong’s The Space at Bretton Hall.
Even after I’d begun wandering away from my ten-year flirtation with the local theatrical community, he kept in touch, constantly trying to get me involved with one project or another.

During that time, he also tried hard to mend a rift I'd managed to engineer with Raymond some years before.
After Anthony abruptly committed suicide, it became clear that he had spent weeks calling and visiting people in what we all realised in retrospect, at his wake, were not pleasant out-of-the-blue chats but final conversations.
I was angry after I’d heard the news of his passing and when I was asked to speak at his funeral, I said so. Ultimately, we didn’t know Anthony Seyjagat at all.

When I heard that Robin Williams was dead, most likely by his own hand, I didn't think of Mrs Doubtfire or the Genie.
I thought of Anthony perched on a thin ledge, braced against a second-storey wall, for a promotional photo for the Baggasse Company’s Children's Storyworld.

I also thought of the unkempt bush around the rectangular mound in an Arima cemetery the last time I visited his grave two decades ago.
Creating is hard. It's so much easier to do work that passes muster than it is to do something that challenges or even defies expectations.

Being funny is hard. You can't really be funny without being a bit wicked and the best humour is downright nasty. Every joke has a butt and it’s often roundly kicked.
In the early 1980’s I wrote my first produced play, after a fashion. I wrote a play called Sno Kone and the Seven Douens as a possible sequel to the smash Christmas production that Helen Camps’ All Theatre Productions had staged the previous year, Cinderama.

It wasn’t really a sequel. It was a black comedy and very little of my original script made it to the stage after it had been workshopped with the cast and Roger Israel had written the music, just the general outline, a couple of songs I wrote lyrics for at the last minute and a few bits of dialogue here and there.

It was a dismal failure. People walked out halfway through, deeply offended at its bleakness. The ones who stuck it out to the end got a fake newspaper celebrating the death of all the heroes complete with a very Randy Burroughs kind of photo of dead bodies lying at the feet of the law.

It's impossible to create anything of any value without leaving some skin behind, and good comedy demands a regular pound of flesh from its author.
In a country with a distinctly immature funny bone, satire gets treated like gospel truth and bawdy fun rules.

Happiness isn't a default for most of humanity. It's something precious that must be continuously earned. I experience it as a frisson of pleasure on occasion, like a cool breeze aberrantly wafting through a stifling and damp mineshaft.
There is a price for seeing the world as it is, and the fee rises as you choose to share that understanding with increasing honesty.
People who do so tend to self-medicate, either to blunt their perceptions or worse, the consequences of expressing them.

There’s an old joke told about the protagonist of the opera I, Pagliacci that’s retold with brusque irony in
Alan Moore’s Watchmen, a bit of harsh wit that hearkens to the final line of the opera, “La commedia è finita!”
Of all those who dare to return with dispatches from the front lines of reality, funny people are the ones most at risk, because the reality they mine ruthlessly is often their own.

If art is truth reinterpreted, then the best comedy is the most dangerous of fun house mirrors, the reflection that is both honest and surreal, the guffaw that catches in the throat sourly.

BitDepth#949 - August 12

Android phones ship with a wide range of software, but some of it is hardly best of breed. These five apps are. Click here to read more...

BitDepth#948 - August 05

Jeffrey Alleyne's successful film about ghetto life in T&T gets widely pirated. This is the story of the making of the film and its widespread theft. Click here to read more...

BitDepth#947 - July 29

TSTT's acting CEO George Hill explains the company's status and strategy for its five year recovery plan. Click here to read more...

Two days of copyright talks later…

Reporting for the Trinidad Guardian on two days of copyright discussions related to Carnival. Click here to read more...

BitDepth#946 - July 22

Two days of deliberations on matters related to copyright in Carnival prompt strong responses from its stakeholders, including this author. Click here to read more...

BitDepth#945 - July 15

Facebook quietly allows a University research team to influence the feeds of the social media website. Here's why that shouldn't surprise you. Click here to read more...

BitDepth#944 - July 05

DigicelTT announces a ban on Voice over IP software, prompting a national discussion on net neutrality. Click here to read more...

BitDepth#943 - June 30

The National Carnival Commission reveals the results of its analysis of Carnival's traffic and parade flows and preliminary proposals for the parade route. Click here to read more...

BitDepth#942 - June 24

An explanation of Bitcoin and its likely applications to e-commerce. Click here to read more...

BitDepth#941 - June 17

Accvent introduces a line of mobile and computer accessories to the Trinidad and Tobago market. Click here to read more...

BitDepth#940 - June 10

Apple previews a new version of its flagship operating system, OS X Yosemite. Click here to read more...

BitDepth#939 - June 03

For today's web publishers, Wordpress is the overwhelming choice. My experiences with the content management system in creating a new website. Click here to read more...

BitDepth#938 - May 27

Steelcase introduces a new chair for the tablet and laptop user. Click here to read more...

T&T a key market for Microsoft

Microsoft's Vice-president of the Sales, Marketing and Services Group, Barry Ridgway explains his strategy for the region. Click here to read more...

BitDepth#937 - May 20

Nokia disappears into the portfolio of Microsoft and a bold and innovative company that once ruled the business of cell phones pays a heavy price for not keeping pace with the changes in the market. Click here to read more...

BitDepth#936 - May 13

Three hard drive crashes in two weeks prompt a review of my backup regime and the hardware supporting it. Click here to read more...

BitDepth#935 - May 06

A team of young developers from the University of the Southern Caribbean win a Microsoft software development challenge. Click here to read more...

BitDepth#934 - April 29

A shockingly violent video of corporal punishment sets the Internet on fire in Trinidad and Tobago. Some thoughts on the debate. Click here to read more...

BitDepth#933 - April 22

More impressions of the Samsung Gear Fit and all the ways it both succeeds and falls short of its potential. Click here to read more...

BitDepth#932 - April 15

First experiences with a smart watch, the Samsung Gear Fit. Click here to read more...

One. Then five to fifteen.

Talking to Carnival students at UWI on a new media panel, I offer some perspectives on the status of the festival in this talk and explore the issues that are likely to shape its future. Click here to read more...

Carnival conversation

Carnival always seems to be on the verge of… Click here to read more...

BitDepth#931 - April 08

Samsung's announcement of not just a new smart watch but also a new OS for one of its new devices points to some new strategy from the Android market leader. Click here to read more...

BitDepth#930 - April 01

Microsoft announces a version of Office for the iPad. Click here to read more...

BitDepth#929 - March 25

As Carnival comes to an end, a national conversation about the festival still looks back at the event's past. Click here to read more...

BitDepth#928 - March 18

Putting an Android powered camera, Samsung's EK-GC100, to the test on Carnival Tuesday. Click here to read more...

BitDepth#927 - March 11

What a day at the Socadrome taught me about the future of Carnival. Click here to read more...

More transparency in Carnival

My editorial for the Guardian for March 10 calls for more transparency in the operations of the state agency responsible for convening Carnival and its most senior stakeholders. Click here to read more...

BitDepth#926 - March 04

Carnival in Trinidad and Tobago should be much further along than it is today. Some thoughts about why that's the case. Click here to read more...

The Geography of Carnival

Editorial for the T&T Guardian written on March 02 on the controversies surrounding the routes taken and planned for Carnival 2014. Click here to read more...

BitDepth#925 - February 25

Leslie-Ann Boiselle, BC Pires, Dean Ackin, David Rudder and Kenwyn Murray all have a stake and interest in Carnival. These are their thoughts on how Carnival might be improved. Click here to read more...

Elitism or entrepreneurship?

Editorial written for the T&T Guardian for February 26 considering the implications of the Socadrome and its potential impact on Carnival 2014. Click here to read more...

BitDepth#924 - February 18

Copyright issues arise again during Carnival 2014 with no apparent solutions or common sense evaluations of the actual law in sight. Click here to read more...

What the NCC should do

On January 23, I responded to a request from the NCC asking for suggestions on media accreditation and handling. This is the document I supplied to them and later in the season, to the management of the Socadrome. Click here to read more...

Zorce on accreditation

A letter from Zorce boss Narend Sooknarine about his experiences applying for accreditation. Click here to read more...

Yooz seeks more for its users

Reporting for the Business Guardian on Yooz, an electronic payment system that works on all phones. Click here to read more...

BitDepth#923 - February 11

Facebook's new post distribution algorithm creates problems for publishers and marketers. Nicole Phillip Greene offers some solutions and approaches to handling the issue. Click here to read more...

BitDepth#922 - February 04

TSTT introduces its first Gigabit Community to a gated housing project in Chaguanas. Click here to read more...

The Gigabit Community

Business Guardian news reporting on TSTT's Gigabit Community project. Click here to read more...

BitDepth#921 - January 28

Apple's Macintosh, 30 years later. Click here to read more...

BitDepth#920 - January 21

Microsoft introduces CityNext, which marries the social engagement of Facebook and Twitter with the power of Open Data's ability to improve governance. Click here to read more...

BitDepth#919 - January 07

A few words of respect about Therese Mills on her passing. Click here to read more...