The Outer Banks has a way of connecting with people: uncrowded beaches that stretch for miles and the endless possibilities that come with them. This is America’s First Beach. Forget Hawaii and California; on the East Coast, the best place to surf is the Outer Banks, renowned as one of the top surfing destinations from New York to Florida. Surfers from all over the country and the world flock here for the annual ESA tournament, or just after a storm swell, to paddle out into the Atlantic and enjoy some of the best waves on the coast. A website serving as the local online connection to Outer Banks surfing therefore required not only an intuitive user experience, but a strategy centered around meeting every surfer’s needs and a powerful back-end engine to deliver third-party content across the site.

For over 17 years, Right Coast Surf, LLC provided surf reports on their OBXSurfInfo.Com website, until July 3, 2018, when they released their new version 2.0. Their local approach to all things surfing remains the foundation of the site, with more toes-in-the-sand surf reporters, surf forecasts and webcams than ever, topped off with locally fueled surf blogs highlighting past swells and other local events. With this release they roll out a “Premium Member” option offering users an improved experience through a new report design and navigation. Premium members also get reduced advertising, unlimited streaming access to the six oceanfront webcams, and an exclusive 5-day surf forecast that displays multiple swells, wind and tide conditions in over 10 locations from Virginia Beach, VA to Wrightsville, NC.

The new version of the website was about 40% done when the Mitro Digital Marketing team took it over, and we got the most challenging parts: hiding the last three days of the forecast in the free version, fixing the issues with the forecast’s tooltip, outputting the forecast data in the report’s submenu, weather alerts, nearby reports, adding new cams, a photo gallery, securing their online payments, setting up other third-party APIs like the buoy reports provided by NOAA, and more.
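The idea behind the free-version forecast limit is simple: check whether the visitor is a premium member and trim the forecast days accordingly. Here is a minimal sketch, assuming a hypothetical is_premium_member user meta flag; the real check depends on the membership setup used:

function obx_visible_forecast_days( array $days ) {
    // Hypothetical premium flag; the live site's membership check may differ.
    $is_premium = is_user_logged_in()
        && get_user_meta( get_current_user_id(), 'is_premium_member', true );
    // Free visitors only see the first two of the five forecast days.
    return $is_premium ? $days : array_slice( $days, 0, 2 );
}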

Surf Report

Initially the website was running on WordPress, so we continued developing the project on the same platform, building a new plugin and writing a lot of JavaScript on the front end plus some PHP on the back end, which outputs the dynamic data you see in their menu.
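For readers less familiar with WordPress, a plugin starts from a single PHP file with a header comment; a minimal skeleton (the name and details here are placeholders, not the actual plugin’s):

<?php
/*
Plugin Name: OBX Surf Info Tools (placeholder name)
Description: Surf reports, forecasts, webcams and third-party API integrations.
Version:     2.0
*/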

We also used PHP for the Surf Reports, which are published by the team of reporters. These reports are a separate post type (surf_reports) and are automatically unpublished at 5 AM the next day. This functionality allows them to keep the most up-to-date content on the website.
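For reference, here is a minimal sketch of how such a post type could be registered; the actual labels and arguments on the live site may differ:

add_action( 'init', function () {
    register_post_type( 'surf_reports', array(
        'labels'      => array( 'name' => 'Surf Reports' ),
        'public'      => true,
        'has_archive' => true,
        'supports'    => array( 'title', 'editor', 'author' ),
    ) );
} );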

Let’s take a detailed look at this simple snippet. First, we need some arguments to select the posts made in the last 24 hours:

$args = array(
    'post_type'      => 'surf_reports',
    'posts_per_page' => 1,
    'category_name'  => $cat, // the report's location category
    'date_query'     => array(
        array(
            'after' => '24 hours ago',
        ),
    ),
);


In the next step we have the query, which selects the posts according to the arguments above:

$surfquery = new WP_Query($args);


Below we have an if statement which checks whether any posts were found in the database; while there is one or more, we output their data between HTML tags:

if ( $surfquery->have_posts() ) :
    while ( $surfquery->have_posts() ) : $surfquery->the_post();
        // Set the correct timezone before building any dates.
        date_default_timezone_set( 'America/New_York' );
        // Get the post date and push the expiry to just before 5 AM the next day.
        $exp_date = new DateTime( get_the_date( 'Y/m/d H:i:s' ) );
        $exp_date->modify( '+1 day' );
        $exp_date->setTime( 4, 59, 0 );
        // Get today's date.
        $today = new DateTime();
        // Compare the DateTime objects directly to see if the post has expired.
        if ( $exp_date < $today ) {
            // Trash the expired report.
            change_post_status( get_the_ID(), 'trash', $cat );
            break;
        }
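change_post_status() is a small project helper not shown above; a plausible minimal version, assuming it simply wraps wp_update_post():

function change_post_status( $post_id, $status, $cat ) {
    // Move the expired report to the given status (e.g. 'trash').
    wp_update_post( array(
        'ID'          => $post_id,
        'post_status' => $status,
    ) );
    // $cat is passed along for any per-category bookkeeping.
}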


The HTML part is the last step. Here we just output the data:

<div class="report_body">
    <a href="<?php echo site_url( '/team/' ); ?>">
        <div class="author_avatar">
            <?php echo get_avatar( get_the_author_meta( 'ID' ) ); ?>
        </div>
    </a>
    <div class="entry">
        <?php
        // Output the reporter's full name.
        $fname = get_the_author_meta( 'first_name' );
        $lname = get_the_author_meta( 'last_name' );
        echo $fname . ' ' . $lname;
        ?>
    </div>
    <div class="entry">Last Update: <?php the_time( get_option( 'date_format' ) ); ?>
        <span class="time" style="display: none;">Time: <?php the_time(); ?></span>
    </div>
    <div class="entry">Time: <?php the_time(); ?></div>
    <div class="content">
        <?php the_content(); ?>
    </div>
</div>
<?php endwhile; endif; wp_reset_postdata(); // close the loop from above ?>


Weather Alerts is another snippet we can talk about. We used the weather.gov API to get the alerts. In PHP, we should initialize a cURL session before calling an API:

 $ch = curl_init(); 


After that we build the headers, including a User-Agent (the weather.gov API requires one to identify your application):

$header = array(
    'Accept: application/geo+json',
    'User-Agent: Mozilla/5.0 (iPad; U; CPU OS 3_2 like Mac OS X; en-us) AppleWebKit/531.21.10 (KHTML, like Gecko) Version/4.0.4 Mobile/7B334b Safari/531.21.10',
);


And then below we call our API:

curl_setopt( $ch, CURLOPT_URL, "API link here" ); # the API endpoint URL
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, 1 );    # return the response into a variable
curl_setopt( $ch, CURLOPT_HTTPHEADER, $header );  # custom headers, see above


We get the data in JSON format, so we decode it using json_decode(), a native PHP function that transforms the JSON data into an object. Having the object, we can parse it and check for alerts. See the code below:

$result = curl_exec( $ch ); # run!
$result = json_decode( $result );
curl_close( $ch );
// Only show the alert if one exists and it became effective today.
if ( ! empty( $result->features )
    && substr( $result->features[0]->properties->effective, 0, 10 ) == date( 'Y-m-d' ) ) {
    $content = '<div class="weather_alert">' . $result->features[0]->properties->headline . '</div>';
} else {
    $content = '';
}
return $content;
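To avoid hitting the weather.gov API on every page load, a result like this can be cached with WordPress transients. A sketch, where obx_fetch_weather_alert() stands in for the cURL routine above and the one-hour lifetime is an assumption:

function obx_get_weather_alert_cached() {
    $content = get_transient( 'obx_weather_alert' );
    if ( false === $content ) {
        $content = obx_fetch_weather_alert(); // the cURL routine above
        set_transient( 'obx_weather_alert', $content, HOUR_IN_SECONDS );
    }
    return $content;
}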


The list of third-party data integrations was by no means short. This made the planning stage all the more important, and by doing our tech spec homework, we were well-prepared for every site feature and custom functionality heading into the development stage of the project.

The site is hosted on WPEngine, and the go-live process went smoothly, though we had to help the WPEngine support team find solutions to the difficulties that arose while deploying the website from the staging area to live. WPEngine is a good platform for simple projects, but their support team doesn’t have much experience with complex projects, and because they don’t give a proficient web developer the right tools to get the job done, working on that platform can be very difficult.

But in the end, everything went well and the website went live. This hard work and client collaboration extends beyond the site functionality: we’ll be creating more tools to make every site visit engaging and interactive.

We don’t create sites to be perfect, we create them to be perfected. That’s why our relationship with Right Coast Surf, LLC relies on an ongoing partnership to push for better results.