Peter J. Hanson - Establishing Ability and Credibility
Seeking Chief Architect & Senior Software Developer Opportunities

I'm technical and easy to work with.
20+ years of technology experience.
Resume

Introduction
I'm interested in helping you navigate technical complexity to solve your problems and see your ideas evolve into working solutions. I have over fifteen years of experience in designing and rolling out custom enterprise and web-based solutions as well as providing support, upgrades, and consulting on existing systems. I have served the following markets and business models: design, startups, auction operators, communications, sales, housing, dealerships, entertainment, healthcare, professional services, publishing, manufacturing, engineering, retail, educational, non-profits, and government organizations.
Technically, I am known for my web services, database, and software architecture experience. Specifically, I am known for my experience with PHP, SQL, JavaScript, XML, Apache, and Amazon Web Services. My work is more on the engines of websites than the look and feel. From a soft-skills perspective, I am known for my communication, my commitment to the customer, and my ability to creatively and accurately translate business needs into working software.
My areas of expertise include systems analysis and design, interface design, rapid application development, infrastructure analysis and maintenance, implementation rollout and support, and systems administration. I have worked in these capacities for many organizations across the world.
What follows introduces my years of experience at-a-glance, testimonials and letters of recommendation, previous work, code samples, technical skills, information you may find relevant, personal information, and my final thoughts. If you find that I've not written about something that you'd like to see, please let me know, and I'll get it to you right away.
Years of Experience, Core Strengths, & Education At-a-Glance
- Technical Solution Design:
I have been working with customer teams since 1995 to deliver technical solutions. I focus on systems using PHP, SQL, AJAX, Apache, and Amazon Web Services. Due to my near-exclusive focus on LAMP-stack (and now LEMP) development since 2001, I have developed skills that I believe enable me to contribute to any LAMP application, whether that work is maintenance, creation, or consulting.
- Software Development Team Management & Leadership: Since 2001, I've led a development team at Up And Running Software, a custom software development company.
- Software Development Experience
- Programming: since 1994.
- SQL and Database Design: since 1997.
- PHP: extensively since 2001.
- JavaScript and XML: extensively since 2003 (NOTE: AJAX stands for Asynchronous JavaScript and XML; thus, I've been using the building blocks of AJAX since 2003.)
- Supporting Web Languages (CSS, HTML, etc.): I've been using most of these since they were created.
- Apache & Linux: since 2001.
- Amazon Services: since 2006.
- Core Non-technical Strengths: Translating business requirements into working systems that add value, understanding the importance of communication, believing in the balance of pragmatism and perfectionism in programming, and taking a personal interest in your success.
- Education:
Most of my software development and management knowledge comes from self-learning: hands-on application and reading online and print materials. However, I do have a B.S. in Management of Information Systems from Michigan Technological University (MTU), and I nearly have the credits for a Mechanical Engineering degree as well. MTU is well known in the engineering community; technical companies such as Google, Microsoft, and IBM recruit its graduates, as do GE, GM, Ford, Dow Chemical, Boeing, and many other multinationals.
As another means of self-learning, I really enjoy exploring new frameworks for software development processes and talking with and learning from people who have worked in the trenches for some time. To that end, I earned my certification as a ScrumMaster. I also read a good deal online about software development processes, with Scott Ambler’s writings being some of my favorite material.
Certification Link on Scrum Alliance’s Site
I like to stay in the details too, and in support of that I am PHP Zend, Magento, and AWS Certified:
Certification Link on Zend’s Site | Certificate Confirmation Letter | Certificate
Certification Link on Magento’s Site
AWS Certificate

Testimonials and Letters of Recommendation
Communicating success from previous clients' perspectives: here are several testimonials and letters of recommendation from clients for whom I served as Chief Architect and led the development team. With the following, I'd like to focus on some of the intangibles that Up and Running offers in how we do the work and how we build relationships with our customers. Here are the summaries and links to the full letters:
Testimonials
- Francois Groleau
Senior Software Engineer at Yahoo
"Pete has provided us with strong technical expertise and leadership in numerous projects. It has been very easy and satisfying to work with him and his team as they always managed to provide us with on-time solutions, both from a technical and resources standpoint."
- Mike Sherman
CTO at Edgecase (formerly Compare Metrics)
"Pete’s technically competent, and he’s easy to work with. Examples of the “easy”: able to talk with stakeholders at all levels professionally, is available when we need, is responsive, gives out his cell phone, and keeps going until we’re happy. Technically, he was able to work within our big data environment (over a billion transactions daily) like he’s been part of the team for years. Specifically, he and his team accomplished the following:
- Defined and created unit and functional tests around a key client-side interface that funnels all requests and responses from the recommendation engine to each client's site. They also refactored the code base, utilizing a dependency-injection approach to handle specific customer exceptions.
- Created a tool set for use on third-party websites to determine and test if a particular client-side library was implemented and running optimally. This flexible tool saved us a lot of time, and ensured consistency and quality in the software's application across a large volume of sites.
Pete and his team are a valuable addition to any organization that needs regular help or just some scale once in a while."
- Louis Rosenfeld
Publisher at Rosenfeld Media, LLC
"We've had the Up and Running team helping us for many years. Every UAR interaction we've had is positive, and the staff is composed of true professionals--conscientious, capable, reliable, and very easy to work with. I highly recommend Pete and his entire team."
- Brian Merrell
Director of Technology at Backstop LLP
"We've worked with Pete and one of his developers for the past year and a half and have been quite happy with the results. Pete has that real enthusiasm for technology that another techie instantly recognizes. He communicates well, knows his stuff and delivers quality results on time. I am happy to recommend Pete and his company."
- Aurangzeb "Zabe" Agha
Founder of Metrical, Inc.
"To call Peter a master of his trade would be a disservice to him. Not only is he more than an expert in his field, but his skills are so numerous that he falls outside the norm of excellent practitioners I have worked with for almost two decades. Peter exhibits, demonstrates and delivers a technical sophistication that is a work of a master craftsman par excellence. As soon as I met Peter, I knew he was who I wanted to work with. Immediately, his ability to clearly communicate his thoughts and articulate his analysis of a complex problem was apparent. Having worked with him for a few months now, I can say that I made the right move: On occasions too numerous to count, Peter has demonstrated his ability to think outside the box to solve complex issues and to deliver results that have made a difference. He can work alone or as part of a team and in every scenario, he delivers. Peter’s knowledge of web application development and specifically the ability to iteratively grow a product by first starting with the MVP—minimum viable product--and then continuously refining feature-by-feature was key for my startup to not only get traction but save costs. Iterative, Agile processes allowed us to see an early version of the concept, keeping us from pouring money into a full-blown estimate-driven project. In this way, in addition to my own gut about Peter, I was able to see how things were moving forward for the first couple of weeks to further reassure myself that he and his team were what I was looking for. Aside from his technical profundity, Peter is a fantastic person. Delivering top quality (customer) service, working side-by-side through stressful situations and being a kind, polite and affable person are his hallmarks and I’m thrilled to know him and to have the pleasure of working with him.
I would unquestioningly recommend Peter—and have on multiple occasions!"
- Jay Gordman
Chief Momentum Officer at AccuQuilt
"Pete and the Up and Running team have been a pleasure to work with. I have worked with his team on a few different Magento implementations the past several years. They have a deep understanding of Magento which allows them to provide outstanding solutions. They are head and shoulders above the rest."
- Nikhil Jain
Founder of Nookster
"Pete is the guy every CEO/manager should have before embarking on any IT project. Pete is an excellent choice for the entire life-cycle of software development. He is also a very ethical guy who delivers above and beyond. Pete stays on top of every project and manages it successfully to the very end. Customer satisfaction is his highest priority. His ability to "roll-up-his-sleeves" and jump in is a great asset to have too. Pete has also become a dear friend over the years. I highly recommend Pete and his firm."
- Michael Ormsby
CEO at Essential Education
"In 2013 we turned over all of our software development to Up And Running. Their teams and project managers are the best we have ever worked with. Development gets done on time and exceeds our expectations. Communication is easy and they continue to come up with elegant solutions to our problems. I highly recommend Pete and Up And Running Software."
- Deb Shelby
CEO at Interactive Ensemble | Primary client served was Mars International
"Pete and his team are great both in a "crisis" as well as strategic planning for long-term projects. I have worked with Pete and his team in both modes and have been pleased with the results. Pete is a great listener and provides guidance on technical recommendations with easy to understand explanations of what the technical choices are and why one may be a better solution than another. I have enjoyed working with Pete and look forward to doing more projects with him and his team."
- J Schwanke
Author of "Fun with Flowers” and Owner of uBloom.com
"Up and Running provides Amazing Service. I've worked with many computer companies for decades... and the Experts at Up and Running are truly Smart, Professional and Extremely Fast...I'm ALWAYS impressed by their timely and accurate assessments... that prove out in execution. I recommend them highly. Look no Further... if you want to get "Up and Running" quickly, efficiently and effectively... Enlist their Service!!!"
- Chris Cordray
Founder of and CEO at Opsfire LLC | Primary contact at ScienceLogic
"Pete is an outstanding manager with excellent technical and customer skills. A great communicator, Pete delivers projects on-time while closely working with clients to deliver results that surpass expectations."
- David Ciccarelli
Founder of and CEO at Voices.com
"Peter of Up and Running provided my company Voices.com with reliable and consistent service for a number of years, enabling us to rewrite our site from the ground up. Working with Up and Running was key to our growth and assured us the comfort of having web developers at the ready during a time when we didn't have our own developers on staff and required outside support. I recommend both Peter personally and his company Up and Running."
- Kelly Thompson
President at Encompass Meetings
"Peter and Up and Running Software is fantastic to work with. They took our system for entering field-based meetings to levels beyond anything that we could have imagined. Any problems/upgrades that we come to him with are greeted with a better solution than we ever could have thought it would be. They rock!"
Letters of Recommendation
- Artsopolis' Letter of Recommendation
- Artsopolis has 24 community sites in its network, with an average of 2.5 million unique visitors per month. They can create a new network website in less than four hours now, a 2,000% improvement. Jeff Trabucco, Director of Artsopolis Network, considers us a part of the organization he leads, and kindly writes, "Up and Running has also helped us shape our business strategy and processes. We feel comfortable discussing vision, mission, strategy, and tactics at the business level and the business operational level. In a sense, I consider them my CIO in addition to my development team."
- Portage Health's Letter of Recommendation
- This relationship has resulted in a beautiful website (Portage Health) and an equally beautiful architecture that drives Intranet and Internet operations using the same data. Karin Van Dyke, Vice President of Portage Health, said some kind things about us in this letter, including, "What impressed me most about Up and Running's approach to our project was their vision and leadership, commitment to the customer, insistence on setting realistic goals with succinct timelines, organization and project management, education and coaching."
- Paragon Business Solutions' Letter of Recommendation
- Karen Hamilton runs a service organization called Paragon Business Solutions, Inc. that helps companies improve their quality and achieve/maintain quality certifications such as ISO 9001, ISO 14001, and BSI 18001. Since Karen has a high standard for quality in processes, I'm very pleased to have received her recommendation. Her letter notes, "We have been working with Up and Running for almost two years. During that time they have shown to be extremely competent professionally, and, just as important, very customer focused." She also cites an example of where we went beyond the scope of work to troubleshoot and correct a problem with the end customer's IT infrastructure that was impacting the project's success.
- Advancia Corporation's Letter of Recommendation
- I believe this project highlights our customer commitment and mentality to do what it takes to ensure the customer is successful. We were asked to help support them onsite from a software perspective, and we ended up helping them in many other areas outside the scope of our duties, including printing batch runs into the early morning, configuring and supporting a LAN and wireless network, and providing direct customer support to our customer's customers (event goers).
- Rotary's Letter of Recommendation
- We served three of the districts in one of the largest service organizations in the world, Rotary. This sentence says a lot, "The product was great, but I think it's the dependable service, flexibility, and cooperativeness that really sets Ian and Up and Running apart." (Ian McKilligan is my business partner, and manages the operations of the company. I led all the development for this project.)
- Physician's Insight's Letter of Recommendation
- I'm proud of this solution because we were the third software provider David and Carol contracted with for this work, and we accomplished much more and at a lower cost than the preceding companies did. Through our work I think we reestablished their trust in the software development community, which is another accomplishment because I dislike when others tarnish our industry's reputation. Also, it's gratifying to read their comments that we delivered a system that works how they want it to. This is extremely pleasing to read given that we were under budget and on time, making this project a big success in the customer's and our eyes. I attribute this to our agile, personal style of developing software, and I believe it's resulted in a relationship where we're a trusted partner with Physician's Insight.

Previous Work: sites I've personally developed, managed, and implemented:
Public Sites
Here are some quick links to sites I've implemented or helped with:
- http://voices.com (Lead Developer, back-end and front-end work; no graphics design)
- http://artsopolis.com (Contributing Senior Developer, back-end and AJAX work)
- http://monomachines.com (Contributing Senior Developer, back-end work)
- http://sendpepper.com (Contributing Senior Developer, back-end work)
- http://emailinstitute.com (Contributing Senior Developer, back-end work)
- http://portagehealth.org (Lead Developer, back-end and front-end work; no graphics design)
- http://nemesisinteractive.com (Lead Developer, back-end and front-end work; no graphics design)
- http://oilandgasinvestor.com (Contributing Senior Developer, back-end work)
- http://remedylife.com (Contributing Senior Developer, back-end work)
- http://sportscardigest.com (Contributing Senior Developer, front-end work; no graphics design)
Most of my work is on the programming of Intranet sites or the administrative features of public websites. Before there were other good frameworks available, I created a framework with all the features described in the PDF document titled Up and Running -- Complete Listing of UAR ABLE System Features.pdf. This framework currently supports over 30 customer websites and represents over 14,000 hours of work. It handles administrative operations, provides database transparency, manages data relationships, administers user management, and facilitates code reuse. Today, I no longer recommend our framework for anything other than very custom solutions that wouldn't benefit from community-driven frameworks; I'm quite happy with the strong frameworks and platforms supported by excellent communities, and I recommend those to our clients. Here are some customers who use or have used my framework:
- One is GS Engineering. They use my product to manage all their projects, track all billable time, manage payroll, and more.
- Another is Physician's Insight. I developed a radiology management website that allows for easy communication between physicians and radiologists.
- The internal workings of Portage Health are notable.
- I developed a website using Perl to manage the teams, statistics, and scheduling of Little Caesars Amateur Hockey League, the largest amateur hockey league in the world.
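As a hedged illustration of the "manages data relationships" feature of my framework, here is a tiny JavaScript sketch of a generic association model (all names and method signatures here are invented for illustration; the framework itself is PHP): any two typed records can be linked without a dedicated join table per entity pair.

```javascript
// Illustrative sketch only: a generic entity-association store.
class EntityRelationship {
  constructor() { this.links = new Set(); }
  key(aType, aId, bType, bId) {
    // Normalize ordering so associate(a, b) equals associate(b, a).
    return [`${aType}:${aId}`, `${bType}:${bId}`].sort().join('|');
  }
  associate(aType, aId, bType, bId) {
    this.links.add(this.key(aType, aId, bType, bId));
  }
  related(aType, aId, bType, bId) {
    return this.links.has(this.key(aType, aId, bType, bId));
  }
}

const rel = new EntityRelationship();
rel.associate('contact', 7, 'user', 42); // link a contact to a system user
rel.related('user', 42, 'contact', 7);   // → true; argument order doesn't matter
```

The point of the normalized key is that the association is symmetric: callers never need to remember which entity was the "left" side of the link.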

Code Samples: most are PHP and JavaScript-focused. If you’d like to see others, please just email me.
PHP Code Samples
- Ebook Extractor Class and phpDocumentor ( 2013 | source code - text file 1 ): These two classes deal with basic operations against eBook files. The extractor works with ePub files ("a free and open e-book standard by the International Digital Publishing Forum"; source: http://en.wikipedia.org/wiki/EPUB) and is designed to deflate and inflate them. When an ePub is inflated, it's presented as a folder structure of HTML documents that represent the book. The text replace class ( 2013 | source code - text file 2 ) is a tool to quickly go through the HTML files and do tag replacements. In this manner, an eBook can be customized for the individual purchaser at the time of the download request. These make use of the phpDocumentor syntax for file and method headers.
- Laravel Assignment Controller, Fat Model and Skinny Controller Pattern ( 2013 | source code - text file ): This is a controller from an English-as-a-second-language (ESL) class application. The controller is responsible for coordinating work on creating assignments, previewing them, sharing them, submitting work for them, and dealing with reviews of the assignment submissions. It is a Laravel controller that offloads most of its work to the model layer using the Fat Model / Skinny Controller pattern.
- Bootstrap, Symfony, phpUnit - Testing Environment Setup and Composer Autoloading ( 2014 | source code - text file 1 ): I created a SQL patch management system (full package: https://github.com/petehanson/dbpatch) that is used on quite a few projects. I've given it to those I work with, and it's been well received because it helps the team work a little more productively. This tool is responsible for tracking which patches have and have not been applied to a specific working copy database. The patches that haven't been applied are then executed in order via their prefixed timestamps (normally normalized on UTC). In this example, there's the Util class ( 2014 | source code - text file 2 ) and the PatchManager ( 2014 | source code - text file 3 ). There are also accompanying phpUnit tests ( 2014 | source code - text file 4 and source code - text file 5 ) that go with those classes, plus the Bootstrap file I use with phpUnit to help set up testing environments and perform the Composer autoloading. PatchManager deals with the organization of patches: it gets a list of applied patches from the database, determines which patches are on the file system, and then determines which haven't been applied yet. It can also take in filters to trim down which patches get applied. Util deals with a couple of specific path-based operations: one expands "." and ".." relative references to create a canonical folder path without the use of realpath(); the other searches for a file using a recursive iterator to process the folders. The tool utilizes Symfony's excellent Console component to define the running parameters and all the options the various commands support.
- entityrelationship.php ( 2008 | source code - text file ): This provides a generic association model for any two entities in one of the systems I developed. An example usage is associating a person/contact record to a system user record or an address record.
- file.php ( 2008 | source code - text file ): This is a file abstraction that represents a file on the file system. It provides convenient methods for accessing various properties and can tie into other parts of the framework.
- objectparser.php ( 2008 | source code - text file ): This is a recursive parsing routine that uses reflection to analyze an object to generate a list of values from that object that can then be indexed through my search engine.
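As a quick sketch of the core idea in the dbpatch sample above, written in JavaScript for brevity (the function and file names here are illustrative, not the actual dbpatch API): unapplied patches are found by diffing the file system against the database's applied list, then ordered by their timestamp prefixes, which sort chronologically as plain strings.

```javascript
// Minimal sketch: which patches still need to run, and in what order?
function pendingPatches(onDisk, applied) {
  const done = new Set(applied); // patches the database says are applied
  return onDisk
    .filter(name => !done.has(name))
    .sort(); // timestamp-prefixed names sort chronologically as strings
}

const onDisk = [
  '20140102T0900_add_users.sql',
  '20131231T2300_init_schema.sql',
  '20140105T1500_add_orders.sql',
];
const applied = ['20131231T2300_init_schema.sql'];
pendingPatches(onDisk, applied);
// → ['20140102T0900_add_users.sql', '20140105T1500_add_orders.sql']
```

Normalizing the timestamps on UTC matters here: it keeps the lexicographic ordering correct even when patches are authored in different time zones.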
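The objectparser.php sample above recursively walks an object to produce indexable values. Here is a hedged JavaScript miniature of that idea (invented names and a simplified traversal; the real class uses PHP reflection): collect every leaf value from an object graph into a flat list a search engine can ingest.

```javascript
// Sketch: flatten an object graph into a list of indexable leaf values.
function collectValues(obj, out = []) {
  if (obj === null || obj === undefined) return out;
  if (typeof obj !== 'object') { out.push(obj); return out; } // leaf value
  for (const value of Object.values(obj)) collectValues(value, out); // recurse
  return out;
}

const record = {
  name: 'Portage',
  address: { city: 'Hancock', zip: '49930' },
  tags: ['health', 'web'],
};
collectValues(record);
// → ['Portage', 'Hancock', '49930', 'health', 'web']
```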
JavaScript Code Samples
- AngularJS, Node.js, REST API File Upload ( 2015 | source code - text file ): This is an AngularJS directive that helps handle custom image and video uploads in a Photo Tour application. The directive is used for associating a form select control with the file input field. It uses a REST API to handle the file upload, and ng-file-upload is used to assist with the uploading work. There is a Node.js service that handles the REST API and uploaded file.
- Node.js and Socket.IO ( 2014 | source code - text file ): This is a Node.js server that handles device registrations for answer and presenter devices in support of an Audience Response System (ARS). It uses socket IDs to track registrations, as well as which program code is being registered for. It tracks which question the presenter is currently navigating to, pushing updates to the device clients so they can refresh their displays with the new question and its available answers. It uses the socket.io library from npm to handle the communication between devices. Answer devices are typically mobile devices and tablets (this system is often used at bring-your-own-device events).
- Canvas for Dynamic Drawing of Regions ( 2015 | source code - text file ): This demonstrates the manipulation of a canvas. Part of the tour application where it is used allows for the dynamic drawing of regions in the tour that can overlay context. This is a demonstration of how that canvas manipulation will take place. It draws dots, lines, and arcs, and connects the dots into a solid Polygon.
- Digium Phone Application ( 2015 | source code - text file ): This is an application designed to run on the Digium phone V8 platform. This VoIP app watches for outgoing calls, uses a long polling system (since the platform doesn't support sockets) to keep a constant connection to a centralized service (its registration process), listens for alerts, sends alerts based on pattern matching (not feature complete yet), and executes a visual display on the phone when an alert is received.
- Jasmine Account Manager and Object Tests: These are two Jasmine tests created to verify functionality of the Digium phone app service. Jasmine Account Manager and Object Tests ( 2015 | source code - text file 1 ) checks some of the features for the account manager and account objects. It uses the jasmine-node module installed from NPM (Node.js) to execute the Jasmine tests. Jasmine Server API uses Frisby ( 2015 | source code - text file 2 ), a REST API testing framework built on Node.js and Jasmine, to do tests against the API endpoints available in the Node.js process. It also uses jasmine-node to execute the tests.
- jQuery Parallax ( 2015 | source code - text file ): This demonstrates a parallax effect on three objects, one in the front and two behind it, on both sides. The three objects move, depending on a mouse position over them, and will create a parallax effect on the page scroll. The module also handles resizing of the window.
- AngularJS Virtual Keyboard ( 2014 | source code - text file ): This example demonstrates some AngularJS directives that deal with rendering a virtual keyboard on the user interface. It is used to allow users to set shortcut keys for operations in the application. The keyboard will also show what's been mapped so far. Here is an image of the interaction (not the end-state user interface): Virtual Keyboard Screenshot.
- Node.js, MongoDB, Mongoose, Google Maps Overlay ( 2015 | source code - text file ): This shows a Node.js model that handles MongoDB interactions through the module Mongoose. It's a component that allows storing and fetching of the metadata about a tour record, which is used for the overlay on Google's street view. This also uses promises to help govern what should be returned.
- AngularJS, Amazon Elasticsearch, Search Tree ( 2014 | source code - text file ): This is a series of AngularJS directives that deal with building out a set of conditions that govern how an Elasticsearch query will run. There are a number of input types (examples: tags, values, and weights) that can be defined. The tool supports a graphical structure so it's easier for someone not familiar with query languages to construct a query. Here is an image of the interaction (not the end-state user interface): Search Tree Screenshot. Here is a sample of it in use (within code, it doesn’t render anything): Search Tree HTML.
- Amazon Elasticsearch Parse Search Syntax Creation ( 2014 | source code - text file ): These are a set of functions that work with the previous code sample (AngularJS, Amazon Elasticsearch, Search Tree). These methods are used to turn the graphical query conditions into a syntax that Elasticsearch can work with.
- jQuery, API Extension Search Execution, Executed within WordPress Page ( 2014 | source code - text file ): This contains functionality for executing the API extension off of a customized search solution. It is executed from within a WordPress page and is reliant on jQuery. This is not a complete code sample.
- State Machine Formatter - Example Run ( 2013 | source code - text file ): This implements a basic state-machine used to parse a chunk of text and attempt to force it to conform to a specific syntax. The syntax is a custom-defined format for recording completed work items. This library is called as the user is typing into a form in the application. It takes the text entered by the user, compares it to the reformatted code, and then displays the differences as recommendations to the user. State Machine Formatter - Example Run shows a video of the functionality in use.
- jQuery and the jQuery Flot - Time Graph ( 2013 | source code - text file ): This uses jQuery and the jQuery Flot graphing, plotting library to draw a graph showing the hours billed by a particular employee aggregated on a weekly basis. To improve the initial load time of the graph and spread server load, this code uses AJAX to progressively load the data for the graph one week at a time. The graph is updated as each chunk of data is retrieved from the server. jQuery and the jQuery Flot - Time Graph - Example Run shows a video of the functionality in use.
- Phone Application, Cordova, AJAX ( 2014 | source code - text file ): This demonstrates the use of AJAX to retrieve data from an API in an Apache Cordova application. (Per http://cordova.apache.org, Cordova is a "set of device APIs that allow a mobile app developer to access native device functions such as the camera or accelerometer from JavaScript application".)
- JavaScript, DataTables, Laravel, Blade ( 2014 | source code - text file ): This uses information from an API and dynamically adds rows of data to a dynamic table. Laravel with Blade is used for the core HTML generation of that sample (not shown in the sample).
- jQuery, Twilio API, Voice Recorder, Assignment Model ( 2013 | source code - text file ): This example uses jQuery and the Twilio API (an API that supports building VoIP and SMS applications; https://www.twilio.com). It handles the operations to utilize the Twilio voice recorder and to save that recording to the assignment model the recording is tied to.
- Unit Test using Karma ( 2013 | source code - text file ): This is an example of a simple behavioral test based on the Karma testing framework (http://karma-runner.github.io).
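To illustrate the registration bookkeeping behind the Audience Response System sample above, here is a small, hypothetical JavaScript sketch. It omits the actual Socket.IO wiring and invents its own names and data shapes; the real server attaches this kind of state to socket events.

```javascript
// Sketch of ARS registration state: who is connected, in which role,
// and for which program code.
class ArsRegistry {
  constructor() { this.devices = new Map(); this.currentQuestion = null; }
  register(socketId, role, programCode) {
    this.devices.set(socketId, { role, programCode });
  }
  unregister(socketId) { this.devices.delete(socketId); }
  // When the presenter navigates to a new question, return the answer-device
  // socket IDs that should be pushed the update.
  navigate(questionId, programCode) {
    this.currentQuestion = questionId;
    return [...this.devices.entries()]
      .filter(([, d]) => d.role === 'answer' && d.programCode === programCode)
      .map(([id]) => id);
  }
}

const reg = new ArsRegistry();
reg.register('sock-1', 'presenter', 'DEMO1');
reg.register('sock-2', 'answer', 'DEMO1');
reg.register('sock-3', 'answer', 'OTHER');
reg.navigate('q5', 'DEMO1'); // → ['sock-2']
```

Keying on the socket ID means a disconnect event can clean up a device's registration in one deletion, which is what keeps the push list accurate.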
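The search-tree samples above turn graphical conditions into Elasticsearch syntax. As a rough illustration of that idea (the tree shape, field names, and function below are my invention here, not the production code), a condition tree can be folded into an Elasticsearch bool query like this:

```javascript
// Sketch: convert a graphical condition tree into an Elasticsearch bool query.
function toEsQuery(node) {
  if (node.type === 'group') {
    // "all" maps to the bool "must" clause, "any" to "should".
    const clause = node.op === 'any' ? 'should' : 'must';
    return { bool: { [clause]: node.children.map(toEsQuery) } };
  }
  return { match: { [node.field]: node.value } }; // leaf condition
}

const tree = {
  type: 'group', op: 'all',
  children: [
    { type: 'leaf', field: 'tag', value: 'outdoor' },
    { type: 'group', op: 'any', children: [
      { type: 'leaf', field: 'color', value: 'red' },
      { type: 'leaf', field: 'color', value: 'blue' },
    ] },
  ],
};
const query = toEsQuery(tree);
// query.bool.must[0] → { match: { tag: 'outdoor' } }
```

The recursion is what lets non-technical users nest "all of" and "any of" groups arbitrarily deep without ever seeing the query language.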
Ruby Code Samples
- Financial Accounts Controller ( 2014 | source code - text file ): The file in app-controllers contains a simple controller driving an administrative tool used to export configuration data for financial accounts (for accounting software integration) from a business application. The import/export task is run asynchronously via a background worker so that large data files can be handled without requiring the user to wait on the page until they finish.
- Financial Account ( 2014 | source code - text file ): The file in app-models contains a model of a financial account that stores data used by code that integrates between accounting software and the web application.
- Base Export Job ( 2014 | source code - text file 1 ): The files in lib-import_and_export are part of a custom-built library for handling asynchronous exporting and importing of data from the application's database. This library is built on top of the Resque gem, and uses a Redis database server as its back-end. The import_export.rb module ( 2014 | source code - text file 2 ) contains the business logic used by the controller to handle user requests for importing or exporting data files. The base_export_job.rb serves as a base class for background jobs that export data from the system.

Programming & Other Technical Skills
- PHP:
- Overview:
- I can expertly create web sites, web applications, and web services with PHP. This includes full-scale B2B and B2C SaaS or stand-alone system offerings, legacy rewrites, and extension, correction, and maintenance of existing PHP systems.
- I am a full-stack developer, meaning I can work on the server-side (back-end) and the client-side (front-end) of the application build out. (Moreover, I can also architect software and set up the systems, local and cloud-based, in support of it.) For the back-end side of software development, PHP is what I’ve focused on the most. I've used PHP or have been consulted on how to use it for nearly all of the web-based applications I’ve been involved with for over a decade (since 2001).
- I’ve been following PHP closely since I read about Rasmus Lerdorf’s creation in 2001. I like how it’s run from an open source perspective, and I like that it’s been vetted so well by some of the world’s largest properties: Wikipedia, Facebook, Flickr, Yahoo!, WhiteHouse.gov, etc. Per Wikipedia, it is the most used open source language in enterprise settings, and it’s used on over 80% of all websites whose server-side programming language is known (source: http://en.wikipedia.org/wiki/PHP).
- What I like about PHP is that:
- It’s a flexible language that lets its developers solve a lot of different web-centric problems. It enables you to quickly write applications and access the components needed to make a website behave on the server side.
- It has a low barrier to entry, allowing people to get up to speed and start to work with the language quickly. It’s also robust enough that it can be used to solve complex problems and be extended in a myriad of ways.
- The language and the community around it are very active and interested in making the language grow.
- A good amount of this online resume is dedicated to presenting my PHP experience, and it goes into detail regarding my experience by framework. For that reason, I won’t go into detail about my experience in this section about specific frameworks; this detail is provided later in this online resume, under “Magento” or “Drupal” for example.
- Frameworks:
- “In computer programming, a software framework is an abstraction in which software providing generic functionality can be selectively changed by additional user-written code, thus providing application-specific software. A software framework is a universal, reusable software environment that provides particular functionality as part of a larger software platform to facilitate development of software applications, products and solutions.” (Source: http://en.wikipedia.org/wiki/Software_framework). A framework isn’t a specific implementation, like a CMS or e-commerce system; it’s common that those software systems were built using a mainstream framework. For example, Magento makes use of Zend, and Drupal 8 uses Symfony2. This section details my framework experience, not my experience with software like Magento and Drupal, each of which I have many years of enterprise application experience with. (On this note, some projects handed off to me were quite expensive for the end customer because the previous team tried to make a [pick a CMS, e-commerce system, etc.] work like an MVC framework. It can be done, but I think it is usually done as a reaction; the reason those software systems were selected in the first place is for the forecasted efficiencies (speed of development, resulting in lower costs and faster time to market).) Selecting the right framework is one of the most important decisions in a development project; two analogies would be selecting a strategy for your business or the chassis for a vehicle. It will impact everything going forward in a big way.
- I’ve been using PHP and most of its mainstream frameworks since 2001. I'm able to talk through the pros/cons of various frameworks and toolsets for various applications, and can help with solution design as well as implementation. The frameworks change, and I think it’s really important to understand how these frameworks have changed and will change in the future.
- On a specific note, the PHP community seems to be gravitating towards frameworks that provide a lot of the initial building blocks needed for site development. These tools typically provide an organization to the code to support separation of concerns and allow a developer or team to start right in on building out code to address the problem domain (the context of the work) instead of support code to make the website work (such as user authentication and architectural implementation of patterns). Some common frameworks that are gaining a lot of traction these days include Laravel, Symfony, Zend, and Yii. The fact that they promote a standard for how an application is structured allows for easier onboarding. A lot of the features they provide also encourage best practices on how the application is constructed. It reduces duplication by focusing on achieving DRY (don’t repeat yourself). Most of these frameworks also have ways of building support modules so you’re not constantly reinventing the wheel on new projects. With some forethought, you can build generic modules that solve your current problem and give you tools to solve future ones as well.
- Before there were strong offerings and strong open source groups around PHP frameworks, I even made my own MVC framework (since sunsetted). The exercise was a valuable learning experience, and one big regret I have about it is that I didn’t open source it and help create a community around it since I could have had a first mover advantage at the time if I had.
- In terms of framework selection, I think each is a tool, and each does a good job for a particular purpose; what is the best for each company is based on what the company wants to do and what they value. (It depends, context is everything.) I like to advise clients to assess not just the software when making their selection, but also the strength of the community and the likeliness of a long future. For large investments, a more formalized vendor or framework selection process using Multi-Attribute Utility Theory (MAUT) is not a bad idea; otherwise, a lot of these important decisions seem to be based on what “feels” right collectively, which leaves a lot to chance.
- It’s been great to live through the evolution of the programming language as well as the frameworks that get better and better. I typically observe what the market wants to use in a mainstream, high-volume manner, learn that, and then apply that for my customers. Thankfully, my investment into PHP and its past frameworks makes learning most new frameworks an easy process. When it’s not easy, then it’s typically because some framework was developed in a way that’s not aligned with development best practices. Usually, this is because they grew so quickly that they couldn’t do things properly to keep up; they pay for it later, and hopefully correct it with a new version.
- Here are the PHP frameworks, libraries, and toolsets that I have deep experience with: Laravel, Phalcon, Yii, CodeIgniter, CakePHP, Kohana, Symfony, and Zend.
- Common PHP Software Systems that Companies ask me to Help With:
- Apache, Bzip2, cURL, FTP, GD Graphics Library, IMAP, LDAP, Mcrypt, Mhash, MSSQL, MySQL, OpenSSL, PayFlow Pro, PDO, PDF, PostgreSQL, Pspell, Regular expressions, SMTP, SimpleXML, XML, and ZIP.
- PEAR: I've used several packages from it, including the debug framework, PHPUnit, and phpDocumentor. I also follow the class naming structure that the PEAR libraries use.
- Composer: Logging, Swiftmailer, and Symfony components.
- Testing: PHPUnit, SimpleTest, and Selenium.
- Code Samples:
- Please see the PHP code samples above.
- If you’d like to see more or something custom, I’d be happy to receive your request at peter@pkhanson.com or via 906-281-1178.
- Case Studies – Examples of PHP Work in More Detail:
- One of my clients is a leading developer of software in the healthcare and patient management market. The application with which I’m currently working is built to manage the entire lifecycle of patient medication management. This includes tracking patient home medications, managing doctors’ orders for additional medications, and generating documentation that is sent home with the patient upon discharge from the medical facility.
- The application is highly customizable for each environment in which it is deployed. In addition, all interactions with the system are audited and logged for full government compliance. On the back-end, the application integrates seamlessly with the hospital network’s IT infrastructure through the use of HL7 and other custom XML integration protocols.
- The application is implemented using PHP with the Zend framework and PostgreSQL as the database with Memcache for caching. For the display layer the application uses Smarty as its templating system with extensive use of AJAX to streamline the physician and nurse workflow.
- My development and QA team are an integral part of the client’s development group for this application. The development workflow is a modified Agile model with story points, sprints, and code review.
- My team and I have participated in the development of the following integral features of the application:
- Enhanced formulary management interface to allow for easier mapping between the full list of possible medications and those that the hospital network maintains internally.
- Structured the Health Level Seven (HL7) Clinical Document Architecture (CDA) implementation and certification such that the CDA XML document format could be used as an interchange format with other healthcare infrastructure.
- Added extensible event architecture to perform certain tasks upon specific and customizable physician and nurse actions.
- Implemented numerous customer and client requested features for better data visibility, migration, and usability.
- Troubleshot various facility-specific environment issues that customers have reported.
- Applied extensive refactoring and reorganization of the system codebase, which has evolved over a decade.
- We are currently focused on the following roadmap items:
- Extensive expansion to fully support e-prescribing protocols and requirements for controlled substances; the end goal is to replace the doctor’s prescription pad entirely.
- Updates to support the requirements for facilities and certifications for hospital networks outside of the United States.
- Improved support for ambulatory workflows to better enable the needs of certain hospital networks and configurations.
- Implementation of integration and API services for use by 3rd party vendors.
- Continued evolution and refactoring of the codebase to improve maintainability and scalability.
- My relationship with this client has been quite mutually beneficial. I have provided the client with full development workflow assistance from prototyping through to QA. The client, in turn, has provided me and my team with extensive training and knowledge that is specific to the healthcare market and associated requirements. It has been an interesting project in which to participate and the value to the customer healthcare networks is tangible and very rewarding.
- For a large non-profit organization established over 40 years ago with over 22,000 members, I built a new application to serve both as their public-facing informational website and as their membership management and billing tool. The new application provides each of the dozens of chapters of the organization with a chapter homepage and tools to assist the chapter leadership with managing the chapter and their own membership.
- The application is based on the WordPress platform. It makes extensive use of existing WordPress plugins to provide a large amount of out-of-the-box functionality. However, many of the plugins are extensively modified to align their behavior with the existing business processes of the organization. These modifications include:
- Customizations to an event management plugin, event-manager: 1) Enabling the ability to restrict registration to specific types of members. 2) Adding an extensive system for allowing members to sign up for optional event workshops as part of the event-registration process. 3) Providing special business logic for giving automatic discounts to certain types of members.
- Customizations to a member management plugin: 1) Extending the account address management features to support custom business logic for required address components and to allow for multiple addresses on an account. 2) Changing account expiration logic to include a grace period and custom business logic for determining the start of an account’s membership period.
- Extension of the PODS plugin: 1) Modifying the image uploader to permit customizations as part of the process. 2) Giving users the ability to draw a custom-cropped region on the uploaded image for use as a thumbnail.
- The application integrates with Authorize.net to provide automatic recurring billing for membership dues. New members are also able to sign up through the website.
- I constructed a search system to index and allow sub-second searching of a large number of member and member-related records. The search system makes use of MySQL's fulltext index type and uses a message queue for indexing to throttle the server load. Search pages on the main WordPress application interact with the search system via AJAX requests. The search system was designed to be independent of WordPress to avoid the performance overhead of invoking the WordPress framework on every AJAX request.
- My team and I integrated the application with ConstantContact to facilitate the sending of bulk emails to subsets of the organization's membership. The code uses the ConstantContact API to send a list of member email addresses to ConstantContact. The application provides an interface for filtering this list of members based on customized criteria such as a member’s registration date and type. Once the list of member email addresses is synchronized to ConstantContact, the administrator can either use ConstantContact’s website to send a mailing to the list or they can use an interface built into the application to compose and send a message via the ConstantContact API.
- As part of the transition from the organization's legacy MSSQL-based application to the new LAMP-based application, my team and I wrote migration code to transfer and remap the data from their old database into the WordPress database. This transfer posed several challenges, including:
- Dealing with encoding differences in binary data between Windows/MSSQL and Linux/MySQL.
- Handling time zone and date format differences.
- Implementing support for MSSQL's password hashing method at the application level to allow members' existing passwords to be usable in the new system.
- Before contracting me to work on the project, the client had worked with a developer who spent weeks learning the client’s business and creating an estimate for the project, only to abandon the project shortly after beginning work (most likely after realizing their estimate was an order of magnitude too low). I worked with the client to rescue the project and break the work into manageable phases.
- Building such a large application on top of WordPress was a challenge. Several features and plugins of WordPress required better database indexing and code improvements to be able to handle the size of this client’s member base. WordPress is great for many contexts, but I think this client would have been better served by creating their system on a clean MVC framework. I did talk through this with them, and they didn’t want to abandon their investment to date in the codebase.
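The fulltext approach behind that search system can be sketched as follows. This is a minimal illustration, not the client's actual code: the table, columns, and function names are invented, and it assumes a MySQL fulltext index such as `CREATE FULLTEXT INDEX idx_member_search ON members (first_name, last_name, bio)`.

```php
<?php
// Hypothetical member search using MySQL's FULLTEXT index via PDO.
// Assumes a fulltext index exists on (first_name, last_name, bio).

function memberSearchSql(): string
{
    // Two distinct placeholders for the search terms, because some PDO
    // configurations reject a reused named parameter.
    return "SELECT id, first_name, last_name,
                   MATCH(first_name, last_name, bio) AGAINST (:t1 IN NATURAL LANGUAGE MODE) AS relevance
            FROM members
            WHERE MATCH(first_name, last_name, bio) AGAINST (:t2 IN NATURAL LANGUAGE MODE)
            ORDER BY relevance DESC
            LIMIT :lim";
}

function searchMembers(PDO $db, string $terms, int $limit = 20): array
{
    $stmt = $db->prepare(memberSearchSql());
    $stmt->bindValue(':t1', $terms, PDO::PARAM_STR);
    $stmt->bindValue(':t2', $terms, PDO::PARAM_STR);
    $stmt->bindValue(':lim', $limit, PDO::PARAM_INT);
    $stmt->execute();
    return $stmt->fetchAll(PDO::FETCH_ASSOC);
}
```

The relevance score that `MATCH ... AGAINST` returns in the SELECT list is what lets the results be ordered by best match rather than by insertion order.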
- For a client in the education industry, I built a testing platform that allows students to take practice tests and prepare for a standard industry test. Schools, adult-education facilities, and correction facilities use this software to help improve the lives of their students. I am proud to serve this company as they make tailored content to help adults without much formal education earn high-school equivalency certificates (GEDs and the like).
- The core application is built on the Laravel PHP framework and uses a MySQL database backend. Laravel was selected because it provides a simple, well-designed framework for MVC applications and many supporting tools for common tasks like database migration, unit testing, and command line scripting.
- The user interface is built on Laravel's Blade template system and uses the Twitter Bootstrap framework to standardize CSS styles across the application and across browsers. Bootstrap’s grid system is also used to implement a responsive design, thereby supporting a variety of different device screen sizes.
- The application’s dynamic behavior is built in JavaScript on top of the jQuery library. jQuery provides a consistent cross-browser JavaScript API and built-in functions that accomplish many common DOM-manipulation tasks.
- In addition to standard multiple choice-type questions, my team and I also built support for more advanced question types, including:
- Questions answered by dragging and dropping images to specific positions.
- Questions requiring the student to mark specific points on a graph.
- Short answer questions that accept mathematical expressions and evaluate them to determine correctness.
- Essay-type questions requiring manual grading.
- The rich front-end behaviors for these advanced question types were built using client-side JavaScript, allowing for responsive and highly-intuitive interaction across a wide range of devices.
- Using the MathJAX library, we implemented support for MathML-based formulas on the user interface. These formulas are used on math tests to better present questions and answer choices. Examples of this functionality include:
- Rendering fractions within question prompts.
- Rendering exponents and root operators in question choices.
- Providing the student with a table of standard formulas.
- The application supports integration with third-party authentication systems through use of the SAML protocol and has been integrated with CornerStone OnDemand.
- One of the client's legacy applications was integrated to support shared authentication between the old and new applications. This allows users to log into either application using the same credentials and provides the ability for a user to transfer their active session from the old application to the new application without needing to re-authenticate.
- To improve the overall quality of the application, my team and I built a tool for synchronizing database content changes between environments. The tool allows content changes to be exported from a database stored in a version control system, and then loaded into other environments. The design allows any database record to be either locally unique to one environment or shared between multiple remote environments. The tool also supports exporting and importing relationships between database records through the use of GUIDs. This tool allows the client to maintain separate production, staging, and development environments and use those environments to improve their quality-control processes.
- Some customers of the client are unable to connect their facilities to the Internet, so an offline version of the application was developed to run on isolated servers on local networks at those facilities. The offline version includes a C#-based installer that installs and configures a web server, database server, and the testing platform application code.
- One client of mine is a leading provider of software that manages used equipment return, inspection, resale, and tracking. The company works with some of the largest providers of computing and heavy equipment in the world to handle end of lease return of assets. The application is built to handle a variety of user roles, requirements, and customer entities; be it the lessee, lessor, resale distribution, or the various members of the client management team.
- The application is implemented using PHP with MySQL as the database layer. Smarty is used for the templating system with extensive use of AJAX to improve the end-user experience.
- I was fully integrated into the client’s development and functional verification workflow, which is based on the Agile/Scrum methodology. The team I oversaw and I were responsible for implementing end-customer and client feature requests while troubleshooting customer bug reports. In addition, I led the QA team members handling functional and verification testing, while our project manager served as ScrumMaster for the project.
- So far, my team and I have implemented and contributed to the following aspects of the application’s development:
- Improving development efficiency by implementing Agile/Scrum techniques and workflows.
- Implementing various aspects of a customer-facing return management system.
- Adding new workflows and steps to support the needs of new and targeted customers.
- Refining reporting and tracking features for the client’s administration team.
- Refactoring of the application to use more modern PHP methods such as PDO to avoid security pitfalls and improve maintainability while also supporting more recent PHP versions. These changes were extensive.
- Designing and implementing functional and regression testing protocols to ensure application correctness and improve development throughput.
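As a hedged illustration of the PDO refactoring mentioned above (the table and column names here are invented for the example, not taken from the client's codebase), a legacy `mysql_query()` call built by string concatenation becomes a prepared statement:

```php
<?php
// Before (legacy pattern, vulnerable to SQL injection):
//   $result = mysql_query("SELECT * FROM assets WHERE serial = '" . $_GET['serial'] . "'");
//
// After: a PDO prepared statement. User input never touches the SQL string,
// so injection attempts are treated as literal data.
function findAssetBySerial(PDO $db, string $serial): ?array
{
    $stmt = $db->prepare('SELECT * FROM assets WHERE serial = :serial');
    $stmt->execute([':serial' => $serial]);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);
    return $row === false ? null : $row;
}
```

Besides the security benefit, this style works unchanged across PHP versions that have removed the `mysql_*` extension entirely.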
- Going forward, my team and the client will be:
- Expanding the application to support new sales verticals that require alternate asset and user workflows.
- Completing the customer-facing return management system.
- Processing feature requests in the task queue as requested by customers and the client/Scrum Product owner.
- Continuing refactoring of the codebase to improve extensibility and maintainability.
- It has been a very interesting project in which to participate. The domain knowledge required for the application is extensive and it has helped us to improve techniques for maintaining and distributing this information in order to quickly bring new team members up to speed. Outside of our normal development contributions, I have provided value to the client by recommending and implementing a new development workflow based on Agile, which has improved development throughput significantly while also greatly improving task queue visibility for upper management.
- I've been working with a company for many years now that operates in the meeting logistics space. Through them, we help end clients such as Sanofi. They help their customers set up meetings, organize venues, manage travel arrangements, and manage all resulting data from the event. They have a web application that centralizes their operations, from meeting definition to handling registrations to a myriad of reporting features. Since work was started on their legacy code base, many new and valuable features have been added, workflows streamlined, and a lot of code refactored to bring it up to modern standards. Here is more information about what I helped them with:
- Used a model and model manager design approach for handling the core data elements of the application.
- Centralized the toolset for sending emails, using a nice email template approach for all outgoing communications.
- Built a survey component that uses a template and template instance design approach so that surveys can change over time and the context of historical survey results aren't lost as the template changes.
- Created a series of unit tests for core/key models of the application, which really sped up future development and helped the client become more confident in the system again. This work has paid for itself multiple times over.
- Built a full set of regression tests for the key workflows of the application.
- Established a tool that can save various export operations into a set of configurations that serve as a report. In this way, those export queries can be run again to quickly produce the data from the system that is needed for offline processes.
- Employed wkhtmltopdf for a series of report output operations.
- I created a database patch management utility, DbPatch, that provides a standard way for developers on a project to track any kind of SQL change and then store the patches for the database in a version control system of one’s choosing. The tool provides a way to create new patches and a way to apply patches to a working copy database. The toolset will review all patches in the repository, review what patches have been applied, and determine the unapplied differences based on a normalized timestamp order. Recently, I rewrote the whole toolset to use Symfony's console package and streamlined its operations. The database operations were refactored to use PDO so that just about any supported database driver could be used. Here’s more information about it:
- Uses PDO for database operations because it provides a driver-agnostic way to interact with a multitude of different database engines. In the config, one only needs to define the driver as part of the DSN string. It’s consistent with the push to standardize on the use of PDO as well.
- Makes use of a tokenizer to separate and parse SQL statements out of a SQL patch file so they can be applied properly in PDO query().
- Employs PHPUnit tests to do core model coverage.
- Permits installation via either Composer or through git submodules.
- Contains a flexible configuration manager that supports patch management of multiple databases. Many larger projects will use more than one database, requiring patch management for all of them. It can also use a hidden field for location references.
- Github link: https://github.com/petehanson/dbpatch
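A much-simplified sketch of the statement-splitting idea described above. The real DbPatch tokenizer handles comments, escaping, and custom delimiters more carefully; this naive version only respects single-quoted strings, and the function name is my own for illustration.

```php
<?php
// Naive splitter: breaks a patch file into statements on semicolons that are
// not inside single-quoted strings, so each statement can be run individually
// via PDO::query(). A production tokenizer must also handle SQL comments and
// dialect-specific quoting.
function splitSqlStatements(string $sql): array
{
    $statements = [];
    $current = '';
    $inString = false;
    $len = strlen($sql);
    for ($i = 0; $i < $len; $i++) {
        $ch = $sql[$i];
        if ($ch === "'") {
            // An escaped quote ('') inside a string stays part of the string.
            if ($inString && $i + 1 < $len && $sql[$i + 1] === "'") {
                $current .= "''";
                $i++;
                continue;
            }
            $inString = !$inString;
        }
        if ($ch === ';' && !$inString) {
            $trimmed = trim($current);
            if ($trimmed !== '') {
                $statements[] = $trimmed;
            }
            $current = '';
        } else {
            $current .= $ch;
        }
    }
    $trimmed = trim($current);
    if ($trimmed !== '') {
        $statements[] = $trimmed;
    }
    return $statements;
}
```

The point of splitting at all is that `PDO::query()` generally executes one statement at a time, so a multi-statement patch file has to be broken apart before it can be applied.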
- I created a Laravel-driven system for a data and analytics startup out of Silicon Valley. The customer has a unique solution for doing end-user satisfaction and feedback surveys. There are two basic components of the system: 1) A JavaScript-based toolset that’s loaded into customer websites to provide the survey. 2) A centralized service built on Laravel that transmits survey data and receives survey posts from the JavaScript widget.
Laravel is a very straightforward framework to use, which is one reason it’s rapidly gaining in popularity. You can get quite complex with the design of your application or leverage it in a really simple manner. Laravel also has a clean and simple way to define and develop a REST API, which is a core piece of this project. For this project, not only was I able to handle a series of CRUD operations, but I could also set up specific endpoints to handle some logic around question selection when calling for widget configurations – all done quite easily. Any framework could ultimately have handled this workflow, but Laravel made the setup process an easy one, the involved parties all had experience with it, the documentation is straightforward, and the learning curve is light for anyone new who has to support the application. Here’s more information about the resulting system:
- Employed Elasticsearch as a document store system and leveraged its search capabilities for the analytics reporting on the back end of the system. Elasticsearch was selected because it’s a flexible document system, it had an already-written-and-vetted driver that could be leveraged in Laravel, and it provides a simple way to cluster and scale.
- Created a third-party class structure for working with Elasticsearch and rolled that into Laravel models. The value this offered is that it gave me a standard interface for model operations to and from the Elasticsearch driver, resulting in streamlined data querying and persistence.
- Communicated with the JavaScript toolset via REST API. This approach was used because it’s low overhead and provides the initial foundation for a full-service-oriented architecture for the system.
- Made use of a server-side-generated survey display that uses the same JavaScript components for use in iframe implementations of the survey. This uses Laravel’s Blade Templating Engine to render the necessary HTML and handle the JavaScript initialization needed for the survey.
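The question-selection logic behind the widget-configuration endpoint boiled down to something like the following. This is a standalone sketch with invented names, not the client's code; in the real system it lives inside a Laravel controller and the questions come from Elasticsearch rather than an in-memory array.

```php
<?php
// Standalone sketch of endpoint logic that picks which survey question to
// serve for a widget configuration. Strategy (an assumption for this
// example): serve the active question the respondent has seen least often.
function selectQuestion(array $questions, array $seenCounts): ?array
{
    // Only active questions are candidates.
    $active = array_values(array_filter($questions, fn($q) => $q['active']));
    if ($active === []) {
        return null;
    }
    // Least-seen first; missing counts default to zero.
    usort($active, function ($a, $b) use ($seenCounts) {
        $countA = $seenCounts[$a['id']] ?? 0;
        $countB = $seenCounts[$b['id']] ?? 0;
        return $countA <=> $countB;
    });
    return $active[0];
}
```

Keeping this as a plain function, independent of the framework's request/response objects, is what makes it easy to unit test and to call from more than one endpoint.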
- Many clients I serve host and run solutions on Amazon Web Services (AWS). (This is something I’ve been doing for some time as well, and I’m often hired to help with such services.) One gap I've seen is the lack of a good built-in way to back up data. EBS volumes have persistence, and Amazon's snapshot feature for EBS volumes does a decent job of capturing a database's state at the time of the snapshot. Given that, I built an AWS backup solution using Laravel that provides scheduling capabilities for this snapshot process, so snapshots can be scheduled out and rotated like a typical backup rotation. This approach keeps older snapshots from cluttering the account and incurring unnecessary storage fees. Here’s more about the system:
- Uses the standard PHP SDK from Amazon for API interactions.
- Requires a minimal set of permissions to do its job, making it easy to configure an IAM User with the necessary permissions.
- Packages up easily into an AMI for fast deployment.
- Supports snapshots of RDS implementations as well.
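The rotation policy at the heart of that backup tool reduces to "keep the N most recent snapshots, delete the rest." A standalone sketch of that policy follows; the function name and data shape are my own for illustration. In the real tool, snapshot metadata comes from the AWS SDK's `describeSnapshots` call and deletions go through `deleteSnapshot`.

```php
<?php
// Standalone sketch of snapshot rotation. Each snapshot is represented as
// ['id' => ..., 'started' => unix timestamp]. Given a retention count,
// return the IDs of snapshots that should be deleted.
function snapshotsToDelete(array $snapshots, int $retain): array
{
    // Sort newest first, keep the first $retain, flag the remainder.
    usort($snapshots, fn($a, $b) => $b['started'] <=> $a['started']);
    $stale = array_slice($snapshots, $retain);
    return array_map(fn($s) => $s['id'], $stale);
}
```

Separating the pure rotation decision from the SDK calls makes the policy trivially testable without touching an AWS account.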
- Databases and data stores:
- I have another section about data. This section primarily pertains to some specific points about PHP and databases.
- Though I think I’ve used every mainstream database and many non-mainstream ones too, here are some examples: PostgreSQL, MySQL, SQL Server, DB2, Oracle (including PL/SQL), and NoSQL stores such as MongoDB, SimpleDB, CouchDB, Solr, Elasticsearch, and HBase. Given that I use some data storage mechanism for every project I create, whether SQL or NoSQL, I am confident designing, creating, and working with data no matter what technologies drive it.
- MySQL tends to be the most common database I’ll work with on PHP applications. One simple reason for this is because most legacy PHP systems that I interact with are built on MySQL. This has started to change since PDO has become the norm for database abstraction. Regarding MySQL specifically:
- I’ve tackled and worked in most areas of the database, from basic usage (typical SQL queries), to stored functions and triggers in the database, to managing deployed implementations.
- I’ve managed standalone servers, Master/Master clusters, and Master/Multi-Slave clusters. I’ve configured these both directly on systems running MySQL and using AWS’s RDS service.
- I’ve worked with MySQL since its 3.X days in 2001.
- PostgreSQL isn’t as common in PHP solutions, though it’s still fairly popular and becoming more so with the use of PDO.
- I prefer the sequence approach for managing table IDs over MySQL’s auto_increment implementation.
- I’ve worked with some of the GIS features that can be installed into PostgreSQL.
- I’ve implemented a few routines in Perl and installed those in PostgreSQL using PL/Perl.
- I know this database deeply.
- Elasticsearch (and really all mainstream NoSQL technologies) provides a great write throughput. As presented in the JavaScript widget project mentioned in this resume’s JavaScript case study section, this is really useful when there are a lot of write requests from a lot of different sources. With a tool like Elasticsearch, scale is easily achieved by configuring a cluster. The node architecture for the service side scales really easily too since both the database and web server for the API are on the same node. All of this means that the system can scale up or down nodes as the load changes on the server and the whole setup is completely peerless.
- When I’m defining the organization of an application, even with the great abstractions that now exist for PHP, I still prefer to have a wrapper in place around the database operations. Essentially this is a data provider interface that handles data operations from the domain context (calling getPersonList() instead of running a direct PDO query for select * from person). This gives options and future flexibility for how you work with, retrieve, and persist the data. For example, with this approach it would be very easy to wrap a data-caching solution around a call like getPersonList() without changing either the client code or how the data provider talks to the database. Good OOP practices help (as always!) ensure this is seamless from the PHP side of the fence.
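As a rough illustration of this data-provider idea (sketched here in JavaScript rather than PHP; the names PersonProvider and CachingPersonProvider and the stubbed db are hypothetical):

```javascript
// A data provider exposes domain-level calls and hides the query layer.
class PersonProvider {
  constructor(db) { this.db = db; }               // db: anything with query()
  getPersonList() { return this.db.query('SELECT * FROM person'); }
}

// A cache can wrap the provider without touching callers or the db layer.
class CachingPersonProvider {
  constructor(provider) { this.provider = provider; this.cache = null; }
  getPersonList() {
    if (this.cache === null) this.cache = this.provider.getPersonList();
    return this.cache;
  }
}

// Stub database standing in for a real driver.
const db = { calls: 0, query() { this.calls += 1; return [{ name: 'Ada' }]; } };
const people = new CachingPersonProvider(new PersonProvider(db));
people.getPersonList();
people.getPersonList();   // second call is served from the cache
```

The client code calls getPersonList() either way; only the wiring decides whether caching is in play.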
- I also deal with a lot of SQL optimization work. I find the problem is usually caused by missing or incorrect indices on the tables the system is querying. I’ve used EXPLAIN quite a bit to determine where a query’s inefficiencies exist and what to do to correct them.
- Interfaces:
- I have other sections about data transmission. It’s uncommon nowadays to write a system that doesn’t interact with another system, so I work in this area nearly daily. This section covers some specific points about PHP and data transmission.
- The most important rule I follow when it comes to implementing a third-party API is to always wrap it. This means defining a class or function that contains the API call(s) that are being made. Some of the benefits:
- This provides a standardized interface that the rest of the application can then use.
- It makes API changes easier to absorb. If the API signature changes, you update it in one spot and the rest of your application is set.
- It’s easier to mock the API if it’s wrapped. For testing purposes, you can simply create a mock object that represents the API workflows and configure it to return or accept the responses you need to evaluate.
- PHP supports the most common types of API interfaces (SOAP, REST, and XML-RPC). I’ve used all three extensively, though lately it’s mostly been REST APIs. They’re easy to work with, and interactions happen through typical HTTP calls, which cURL handles quite nicely in PHP.
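The wrap-the-API rule above can be sketched as follows (JavaScript; GeocoderClient, its endpoint, and the mock transport are all illustrative, not from a real project):

```javascript
// Wrapper: the rest of the app talks to GeocoderClient, never to the raw API.
class GeocoderClient {
  constructor(transport) { this.transport = transport; } // transport: (url) => response
  lookup(address) {
    return this.transport('https://api.example.com/geocode?q=' + encodeURIComponent(address));
  }
}

// For tests, inject a mock transport instead of making a real HTTP call.
const mockTransport = (url) => ({ url, lat: 43.6, lon: -116.2 });
const client = new GeocoderClient(mockTransport);
const result = client.lookup('Boise, ID');
```

If the provider changes its signature, only GeocoderClient changes; every caller keeps using lookup().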
- Systems:
- I have other sections about my systems experience (Amazon Web Services (AWS) and Management, Mac and PC/Windows Server Experience, and Linux/Apache & Systems Administration). Prior to focusing only on software, I ran a systems and software firm that I founded in 1995, which served both businesses and individuals. I built both Windows and OSS servers, and I was an early adopter of cloud technologies through Amazon. Over the years, I’ve helped hundreds of clients set up, troubleshoot, and design systems for scale. This section covers some specific points about PHP and systems:
- Most system setups I get asked to do focus on the LAMP (LEMP as well now) stack. I’ve been using Linux in some manner or another since 2001. For my needs alone, I’ve run a multitude of servers running a variety of distributions over the years (RHEL, CentOS, Ubuntu, Debian, Slackware, Gentoo, etc.). Before moving to AWS in 2006, I even had a few rack setups. I probably help at least 20 customers a year with their particular needs.
- Apache has been a constant on most of these machines, though lately I’ve been using Nginx (the “E” in LEMP, as it’s pronounced “engine-x”) more and more. Its performance is a bit better and it has less of an impact on the system.
- Most of my experience configuring PHP in a webserver is through mod_php or, more typically lately, PHP-FPM.
- For a six-year stint, I ran Gentoo Linux as my primary OS on my workstation. By “eating my own dog food,” I got a great inside view of how Linux is organized, built, and configured, and I learned a lot from it. For example, I have no qualms about rolling my own kernel when it makes sense, as that’s a common procedure within Gentoo.
- A few customers have run Windows environments for their applications, usually IIS running PHP via ISAPI. Typically, these projects have also used MSSQL as their database.
- I prefer an approach of configuring a virtual machine to represent the application stack for a software solution, especially for a complicated one that requires services other than your typical webserver and database server. I’ve used Vagrant, with various flavors of Linux for this purpose. It allows me to define a straightforward config, along with customized scripts to configure the environment exactly as needed. This configuration can easily be shared among developers, ensuring that everyone is developing on the same platform. As an added benefit, these configurations can even be used as a build point for staging and production systems.
- I can administer servers at the command line, and I’ve administered servers supporting PHP systems since 2001. So, I’m confident in combining systems and software work efficiently and according to best practices. I didn’t know it at first, but over the years I’ve learned that this is a somewhat rare combination: usually systems team members do systems work and software team members do software work. Because I can do both, I’ve earned some interesting projects, such as one for a client that does systems monitoring (power, network, storage, servers, applications, cloud, etc.) for governments and even white-labeled systems for AT&T and Cisco. A couple of others involved PHP servers working with low-capability devices, such as mobile power meters and personal-metric tracking devices.
- Architecture:
- Most of the code I write is object-oriented with a focus on reusability and resiliency to change. I’m comfortable writing procedural and non-object-oriented code (I’ve used BASIC, Fortran, Pascal, procedural COBOL, etc.), but many companies are moving away from that form of development.
- Here are the patterns I apply in PHP regularly: Strategy, Factory, Façade, Front Controller, Module, Proxy, Singleton, Command, Decorator, Template Method, Adapter, Lazy Initialization, Iterator, State, Observer, MVC, Skinny Controller/Fat Model, and Inversion of Control. I am familiar with and could apply most of the patterns here: http://en.wikipedia.org/wiki/Software_design_pattern
- Here are some design principles I apply in PHP regularly: Encapsulate what changes, DRY principle, Program to interfaces and not implementations, Open for extension but closed for modification, and Focus on loose coupling.
- I can also architect systems and databases for scale. Please see my section about scalability.
- Please see the Previous Work section for project examples.
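To illustrate one of the patterns and principles listed above, here is a minimal Strategy sketch in JavaScript (“program to interfaces, not implementations”); the pricing example is hypothetical:

```javascript
// Strategy: callers depend on the interface (anything with price(amount)),
// not on any concrete discount implementation.
const fullPrice = { price: (amount) => amount };
const memberDiscount = { price: (amount) => amount * 0.9 };

// checkout() is programmed to the interface; swapping strategies needs no change here.
function checkout(cart, pricing) {
  const subtotal = cart.reduce((sum, item) => sum + item.cost, 0);
  return pricing.price(subtotal);
}

const cart = [{ cost: 40 }, { cost: 60 }];
const regular = checkout(cart, fullPrice);
const member = checkout(cart, memberDiscount);
```

New pricing rules become new strategy objects; checkout() stays closed for modification but open for extension.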
- Testing:
- For PHP work, I’ll typically use PHPUnit and Selenium. I’ve used SimpleTest as well and have been taking a look at Codeception.
- For both unit tests and functional tests, I’ll typically use PHPUnit as the test assertion framework. For functional tests, I’ll use a Selenium wrapper so I can carry out browser operations. By running these in PHPUnit (rather than, say, through the Selenium IDE), I can make further decisions and checks when the functional workflow ends. For example, say the use case is “Fill out this form and submit the new user registration.” With Selenium, I can fill in the form and confirm I get a success page. With PHPUnit added, I can also assert that the MTA sent the user an email, and I can make a DB call to ensure the new user was added to the USERS table with the right state flags assigned.
- I’ve built up some standard support utilities for use in the PHPUnit bootstrap process: basically, simple ways to mock database operations and handle a myriad of filesystem operations for toolsets that require that kind of system interaction. I’ve also helped guide a testing-framework toolset for running regular, recurring functional tests on a series of sites covered in its configuration.
- I have more information about testing in general in my Testing or Quality Assurance (QA) section.
- Why PHP?
- PHP is Simple to Set Up and Use: PHP uses C-style syntax, so anyone who has programmed with a similar syntax will pick it up quickly. The learning curve is short, making it a good language to jump into without intimidation, especially for web application development. (That doesn’t mean newcomers will do it expertly, but they’ll get there after some years.) The documentation is well organized and especially useful if you need to figure out how a particular piece works. A lot of the overhead of variable declaration isn’t needed, allowing very simple and quick scripts to be written (more on this below, though).
- Focused On Its Task: It’s really geared to process and handle web-stack operations. It does a lot of the heavy lifting for giving you access to server data, cookie data, and query string and post data. It’s very easy to manipulate the response back to the browser in any way you see fit. More on this:
- Header Control: PHP’s header() function is a Swiss Army knife that allows any type of manipulation of the page headers before content starts getting sent back to the browser.
- Cookies: Cookies can be a bit complex when accessed raw. PHP puts all the relevant cookie information into a simple array you can use in your applications, and setting cookies is done through a straightforward function call.
- Session Management: PHP’s session management is quite useful for managing the state of a site visitor; developers typically keep authentication parameters here. PHP provides a lot of flexibility in how you track sessions. By default it stores sessions in temp files, but it can easily be configured to store them in a database or a key/value store like memcache, which is generally the preference when you have a load-balanced site.
- Buffer Management: There are a series of features that allow the developer to control how browser responses are sent, managed, and flushed.
- Community: The PHP community is large and varied. The language has been around for quite a while (15+ years) and in that time has seen a lot of revision. Really excellent PHP frameworks have sprung up in that time, giving developers toolsets they can build on top of, in addition to attracting more developers and business users. You can find just about any manner of plugin or module now, especially with the popularity of Composer and Packagist and their inclusion in projects. (That does not mean they should all be used, of course.)
- Flexible: PHP gives developers a number of ways to use it and to shape how it responds and processes requests.
- Event-based System: There are tools that give PHP an event-based response/feel, similar to NodeJS.
- Functional Programming Components: It’s very easy to implement a functional paradigm with PHP. Its OOP constructs aren’t required, and later versions offer anonymous functions and closures.
- HipHop: Driven by a team at Facebook, HipHop compiles PHP for significant speed enhancements (its successor, HHVM, is a JIT virtual machine), getting more bang for your server buck, so to speak.
- Criticisms, Along with Some Rebuttals: As I alluded to above slightly, some of the advantages of the language are also its weaknesses.
- A weakly-typed language does make it easy to get into programming; there isn’t a lot of variable management work, unlike, say, Java or C#. However, because that safety net isn’t there, the programmer has to be careful and disciplined. A mistyped variable name can make debugging difficult, and you might need to scan the code for a while until you find the typo. It’s also possible to accidentally repurpose a variable for a completely different purpose (having $person hold a name and then assigning an object to it, as one example). This weakness can be offset with careful management of variable names and the habit of initializing variables and properties to ensure you’re not using them in unintended ways.
- PHP does have a low barrier to entry, and this is very good for PHP’s continued existence. However, it also attracts hobbyists who walk through a few tutorials, set up something basic, and then call themselves application developers. Other developers can figure this out quickly, but business stakeholders without development backgrounds cannot as easily (they need to find a developer they trust to do the vetting). The systems such hobbyists put together are highly coupled, brittle, hard to maintain, and very often have to be thrown away entirely. It takes discipline to be a software engineer and improve one’s craft, and a tool with a higher barrier to entry encourages people who are serious about it to invest more time, much like how the university I attended used advanced math and chemistry to determine who really wanted to be an engineer. This is why, during interviews or when evaluating another developer, I’ll look for examples of work and ask for a code test; a quick review will tell me what type of developer they are.
- PHP is often presented as an insecure language. At least a few developers have subtly or not so subtly scoffed when they heard that I program a great deal in PHP. Some of them have claimed it is insecure, but they never give reasons when asked for details. Admittedly, there were a few default-behavior decisions that, in hindsight, were not good choices, but you can disable those easily, and in current versions of the language those capabilities (magic quotes, evil!) are disabled by default. Most of the time, vulnerabilities manifest because of how the application is put together. If an app is designed with a security hole, the programming language isn’t going to solve that problem for you, and it shouldn’t be expected to, because that’s the programmer’s job. I can just as easily write a Java program that suffers from SQL injection (for demonstration purposes only). It’s not the language; it’s the person behind the keyboard.
- PHP is based on a series of C libraries that give access to a whole slew of operations: string manipulation, array operations, math operators, and so on. That’s helpful, but the downside is that these functions aren’t always consistent in naming and argument order, which I admit is a bit of an irritation. However, the PHP manual or a good IDE with code completion minimizes the problem, and once you’ve used the language for a long time, it’s rarely an issue.
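The weak-typing pitfall described above (a mistyped variable name) has a direct JavaScript analogue; this illustrative sketch shows how strict mode surfaces the typo immediately instead of silently creating a new variable:

```javascript
'use strict';

// Without strict mode, assigning to a misspelled name silently creates a
// global; with it, the typo surfaces right away as a ReferenceError.
let personName = 'Ada';

function renamePerson(next) {
  persnName = next;   // typo: should be personName
}

let caught = false;
try {
  renamePerson('Grace');
} catch (err) {
  caught = err instanceof ReferenceError;
}
```

The same discipline applies in PHP: initialize variables deliberately so a typo shows up as an obvious failure rather than a subtle bug.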
- Other PHP Notes:
- Passing it on:
- I’ve mentored four students pro bono.
- I also have given academic and professional presentations to students and colleagues.
- Whenever possible, I encourage friends and their children to pursue computer science in general if I see that they have an interest. I’ll try to introduce systems like Scratch (http://scratch.mit.edu/, http://wiki.scratch.mit.edu/wiki/Alternatives_to_Scratch) to them, as well as explore engineering-focused K-12 activities offered locally.
- For someone with more experience, usually in the late high school to early college range, I recommend the following approach:
- Read “Clean Code: A Handbook of Agile Software Craftsmanship” (Amazon Link | My book review). This great book is applicable to any language and teaches a lot of excellent concepts to help you learn to write good code.
- Read more:
- Uncle Bob, from the book above, should cover this, but I also advise looking up the acronyms SOLID and STUPID. They give great guidance on principles for structuring software. (Links: http://en.wikipedia.org/wiki/SOLID_(object-oriented_design) and http://nikic.github.com/2011/12/27/Dont-be-STUPID-GRASP-SOLID.html)
- For a good general outline of best practices in PHP, these are great: http://www.phptherightway.com/ and http://www.php.net. The latter is the definitive reference for PHP functions and features; the former is a great site put together by a number of thought leaders in the PHP community that nicely presents the best practices developers should follow. For general discussion (not definitive, but often useful), I follow http://www.reddit.com/r/php.
- Create something:
- You can start out with something simple, like this: http://php.net/manual/en/tutorial.php, which will give you a quick overview of what it’s like to program in PHP.
- Then move toward an example/tutorial that brings separation of concerns (SOC) into play. MVC frameworks provide a level of SOC between your application logic, routing/handling logic, and presentation logic, so one of the common MVC frameworks would work well here. Most are simple to pick up for basic applications, and there are great tutorials out there now. With this background, one of the best ways I’ve found to learn a new technology is to apply it to a pet project of yours or of someone you care about. Basically, answer this question: what would be fun or useful (or both) to build? Who knows, you might create your own business in the process.
- I care deeply about education, and besides giving back as a mentor and advisor to those interested in entering the software development field, I have donated time and money to schools and related events. The latter includes a SMART Board to the Joplin, MO school system, an online survey system to improve science performance as measured by ACT scores, and hands-on self-defense training and seminars. Through 2019, I served on the Technology Association of Iowa's Workforce & Education Committee.
- Personal Education:
- There are some PHP books that I’ve read and like:
- “Programming PHP” Amazon Link
- “PHP Cookbook: Solutions & Examples for PHP Programmers” Amazon Link
- Programming In General:
- I follow software development and architecture sites mostly, not specific to a language. However, I do follow http://www.reddit.com/r/php as mentioned and frequent framework-specific sites when working through a particular problem or build-out.
- Some other books I really like, in addition to those mentioned elsewhere in this resume (on Agile and usability):
- “Clean Code: A Handbook of Agile Software Craftsmanship” Amazon Link | My book review
- “The Clean Coder: A Code of Conduct for Professional Programmers” Amazon Link | My book review
- “The Pragmatic Programmer: From Journeyman to Master” Amazon Link | My book review
- I’ve always thought that a good programmer should be well read on a number of subjects and languages. Broad exposure gives you access to new ideas and ways of thinking about problems. Here’s an image of a couple of my bookshelves:
- Closing Remarks: As I hope the length of this section indicates, I have used PHP for a long time and I will continue using PHP for as long as it exists and the market wants it. Given the current trends that I’ve experienced, it will be around for a long time.
- Data: Two of my core strengths are database design and database management:
- I have been using databases and SQL since 1997. It is one of my core strengths.
- I have created or managed hundreds of databases in MySQL and PostgreSQL for different purposes and different clients.
- I have used NoSQL databases, such as MongoDB, SimpleDB, CouchDB, Cassandra, HBase, and DynamoDB (AWS), for years now. Of note was a B2B SaaS offering using data stores (Mongo and Elasticsearch) that I contributed to. Another was for Yahoo! There are many types of NoSQL databases (wiki), and I understand and can apply the right one for the right purpose.
- Related to big data, I have used Redis, Memcache, Lucene, Solr, ElastiCache, and Elasticsearch. By virtue of how they index and cache information, these are NoSQL solutions, even though they aren’t marketed that way given their primary reasons for existence. (Branding and positioning are important in IT too.) Some example work:
- Utilized the Elasticsearch river system to feed in relational data for indexing purposes.
- Built a robust search interface to provide various query mechanisms to return result sets from Elasticsearch.
- Configured auto-scaling solutions for Elasticsearch on AWS EC2 instances. The auto-scaler started new instances based on a specific AMI representing a cluster node for the application, so Elasticsearch would grow and contract as the auto-scaler adjusted the cluster.
- Configured and used Redis in a pub/sub model for handling event management in an application.
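The pub/sub model used with Redis can be sketched with a small in-process stand-in (illustrative only; a real deployment would use a Redis client rather than this PubSub class):

```javascript
// In-process stand-in for Redis pub/sub: channels map to subscriber lists.
class PubSub {
  constructor() { this.channels = new Map(); }
  subscribe(channel, handler) {
    if (!this.channels.has(channel)) this.channels.set(channel, []);
    this.channels.get(channel).push(handler);
  }
  publish(channel, message) {
    (this.channels.get(channel) || []).forEach((handler) => handler(message));
  }
}

const bus = new PubSub();
const seen = [];
bus.subscribe('orders', (msg) => seen.push(msg));
bus.publish('orders', { id: 17, status: 'shipped' });
```

The appeal of the model is decoupling: publishers never know who, if anyone, is listening on a channel.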
- The problem domain I’m working within is one of the key factors that determines which technology I’ll use. Some projects are better served by an RDBMS, others are enabled by document storage, and some are best written with a mix of both. Large volumes of writes and distributed content creation are usually indicators that I’ll need to look at a document-store system. Stated differently, NoSQL comes into play when the solution requires quick collection of data with low overhead or when the data sets vary substantially from one another. If the data has a lot of structure and relationships, then an RDBMS is what I would consider. Many of the solutions I’ve assembled use a relational database and a NoSQL database side-by-side. For example, in one project, we collect massive amounts of survey data in the NoSQL system, normalize it, and then write it out to a MySQL database for efficient data analysis.
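The normalize-then-store step described above might look roughly like this (a JavaScript sketch; the survey field names are hypothetical, not from the actual project):

```javascript
// Survey responses arrive as loosely structured documents; normalize() maps
// them to flat rows suitable for a relational reporting table.
function normalize(doc) {
  return Object.entries(doc.answers || {}).map(([question, answer]) => ({
    surveyId: doc.surveyId,
    respondent: doc.respondent || 'anonymous',
    question,
    answer: String(answer),   // relational column holds text
  }));
}

const rows = normalize({
  surveyId: 42,
  respondent: 'u-901',
  answers: { satisfied: true, comments: 'Fast checkout' },
});
```

Each varied document becomes uniform rows, which is what makes downstream SQL analysis efficient.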
- I created a database patch management utility to help with some of the housekeeping work that’s required in larger projects, and it saves a lot of time.
- I'm able to take a concept, design the database formally using Entity Relationship Diagrams, Data Dictionaries, and other supporting modeling conventions as needed, and implement the database.
- I'm well versed in using existing databases, whether it be simple usage, extension, improvement for faster and safer performance, upgrades, or other aspects of database usage and maintenance.
- I have extensive experience writing SQL and know nearly every nuance of the SQL-99 standard (basic queries, types of joins, subqueries, views, temp tables, transactional processing, etc.), as well as a number of proprietary features like stored procedures and VBA scripting in Access.
- Though the majority of my experience is with MySQL and PostgreSQL, I have experience with these as well: Oracle, SQL Server (including migrations), Informix, DB2, FileMaker, Caché, Access, and FoxPro.
- I've modified existing open source databases to perform better by adding or extending features.
- I've created systems with shared database access among different applications, running on different platforms.
- I am well versed in advanced database operations, such as transaction management, triggers, views, complicated queries, and stored procedures.
- I've also used a number of abstraction layers for database communication, including PDO (PHP), DB (PHP), ADOdb (PHP), and NpgSQL (more of a driver than abstraction). Features utilized in these layers on past projects include: stored procedure execution, caching and performance enhancers, record set selectors, and performance monitoring.
- Most of my applications that involve a database are built with a datasource layer between the main system objects and the actual SQL queries that manipulate the database, accomplished using the Domain Object Model and the Data Mapper pattern. I've also written applications that use the Active Record pattern. More recent development has made use of object-relational mapping (ORM) tools, including Doctrine and Eloquent.
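A minimal Data Mapper sketch in JavaScript (Person, PersonMapper, and the stubbed db are illustrative, not from an actual project):

```javascript
// Domain object: knows nothing about storage.
class Person {
  constructor(id, name) { this.id = id; this.name = name; }
}

// Data Mapper: translates between domain objects and database rows,
// keeping storage concerns out of the domain model.
class PersonMapper {
  constructor(db) { this.db = db; }                 // db: { rows: [...] } stub
  find(id) {
    const row = this.db.rows.find((r) => r.person_id === id);
    return row ? new Person(row.person_id, row.full_name) : null;
  }
  insert(person) {
    this.db.rows.push({ person_id: person.id, full_name: person.name });
  }
}

const db = { rows: [] };
const mapper = new PersonMapper(db);
mapper.insert(new Person(1, 'Ada Lovelace'));
const found = mapper.find(1);
```

Contrast with Active Record, where the domain object itself would carry find() and insert(); the mapper keeps those responsibilities separate.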
- I have used advanced stored procedures and triggers to automatically update dedicated un-normalized history tables when data in the primary tables changed.
- SQL
- In all of my programming work, I've used SQL for database operations.
- I've written queries for accessing data directly and routines that assemble SQL on the fly.
- I've worked with existing SQL statements, both using them as is, and optimizing them if needed.
- I have more information about my PHP data and database experience above.
- JavaScript / AJAX / Rich Internet Applications (RIA):
- Overview:
- I am a full-stack developer, meaning I can work on both the server side (back end) and the client side (front end) of an application build-out. (Moreover, I can also architect software and set up the systems, local and cloud-based, that support it.) On the client side, JavaScript is a member of the important front-end triad of HTML, CSS, and JavaScript. I've used JavaScript, or have been consulted on how to use it, for nearly all of the web-based applications I’ve been involved with for the last decade.
- I can expertly create web sites, web applications (including single-page applications, or SPAs), and web services with JavaScript, making use of prototypal inheritance, other OOP approaches, modules, and standard design patterns. I can structure code for resource optimization/minimization to produce high-performance, scalable JavaScript.
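Two of the techniques named above, the module pattern and prototypal inheritance, in a minimal illustrative sketch:

```javascript
// Module pattern: an IIFE exposes a small public API over private state.
const counterModule = (function () {
  let count = 0;                       // private; unreachable from outside
  return {
    increment: () => ++count,
    value: () => count,
  };
})();

// Prototypal inheritance: widgets share behavior through their prototype.
const baseWidget = {
  render() { return '<div>' + this.label + '</div>'; },
};
const button = Object.create(baseWidget);   // inherits render()
button.label = 'Save';

counterModule.increment();
counterModule.increment();
```

The closure keeps count genuinely private, and every object created from baseWidget shares a single render() implementation.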
- Frameworks:
- I’ve been using JavaScript and most of its mainstream frameworks since 2003. I'm able to talk through the pros/cons of various frameworks and toolsets for various applications, and can help with solution design as well as implementation. The frameworks change, and I think it’s really important to understand how these frameworks have changed and will change in the future.
- On a specific note regarding frameworks, Model View ViewModel (MVVM) JavaScript frameworks (AngularJS being one) are one set of technologies I’m following closely and using in enterprise settings. I see the web moving in that direction: more client-side interaction, more like a desktop GUI app than the web pages we have been interacting with for decades. I’ve used most mainstream JavaScript MV* frameworks and libraries, and I can talk through the pros and cons of each.
- For mobile applications, I’ve used hybrid frameworks and supporting toolsets such as PhoneGap, Appcelerator’s Titanium, Sencha Touch, trigger.io, Corona, Dojo Mobile, jQuery Mobile, and Apache Cordova to create multi-platform and multi-browser applications that are easy to maintain. I know how to use JavaScript for high-performance packaging of mobile and tablet applications.
- It’s been great to live through the evolution of the programming language as well as the frameworks that get better and better. I typically observe what the market wants to use in a mainstream, high-volume manner, learn that, and then apply that for my customers. Thankfully, my investment into JavaScript and its past frameworks makes learning most new frameworks a fairly easy process.
- Here are the JavaScript frameworks, libraries, and toolsets that I have deep experience with: Node.js, AngularJS, Ember.js, Backbone.js, MooTools, Ext JS, PhoneGap, Titanium, Corona, Apache Cordova, Dojo, Dojo Mobile, jQuery, jQuery Mobile, Prototype and Script.aculo.us, Knockout, Raphaël (Raphael.js), YUI, and probably more by the time you read this, as I don’t update this monthly.
- Code Samples:
- Please see the JavaScript code samples above.
- If you’d like to see more or something custom, I’d be happy to receive your request at peter@pkhanson.com or via 906-281-1178.
- Case Studies – examples of JavaScript work in more detail:
- I did a lot of work over the last year for a venture-funded startup focused on adaptive commerce, with the goal of personalizing shopping beyond what standardly exists today. Their project was an e-commerce overlay system that augmented existing catalog data with personalized filters and recommendations for their client websites’ data sets. It was delivered as injected JavaScript, was compatible with cross-origin resource sharing (CORS), used NoSQL data stores (Mongo and Elasticsearch), and relied on lightweight custom JavaScript objects in its newer implementation. More information regarding some of the items I helped with:
- Worked on a tool that could be injected into a customer website to determine if there were any glaring issues with how the toolset would run in that environment.
- Helped analyze and document how their primary configuration generation tool operated.
- Architected and organized how their new version of their platform would ultimately be used to manage a new site implementation from start to completion.
- Debugged a series of performance issues with an older deployment of their client-side tools.
- For one client, who offers omnichannel personalization for clients like Best Buy, CDW, Target, Wine.com, Neiman Marcus, Costco, Burton, and Walmart, I helped write up a set of tools and tests to help ensure stability of the client-side libraries that their customers use. More information about how I contributed:
- Built out a series of comprehensive behavior and unit tests for their client-side injectable component (custom JavaScript objects), executed through Jasmine. The component provided all access into their core system, so these tests ensured stability would be preserved whenever the component was modified (a change would trip a test, either from a function signature issue or a behavior change). If there were an error or bug in the component, it would break all integrations with the backend solution, so this was an important set of tests. There were approximately 1,500 assertions in the test suite. QUnit was another viable alternative I explored for test execution.
- Built out an injectable detection tool that would execute a series of validation checks to determine if the customer had their local environment properly configured and was using the injectable component correctly.
- Designed the detection tool to accept and deal with CORS issues.
- Set up the detection tool to be library-light, so a minimal set of components was required to make the tests and validation work.
- Created the tests so they could be used in a harness that loads the page, injects the tools, and parses the test results; the harness could then run on a schedule to keep an eye on customer sites and ensure they stayed compatible with the client-side tool.
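A flavor of those signature and behavior checks, reduced to a self-contained sketch (the widget object and the homemade expect() stand-in are illustrative; the real suite ran under Jasmine):

```javascript
// Minimal stand-in for Jasmine's expect(), so the sketch runs stand-alone.
function expect(actual) {
  return {
    toBe(expected) {
      if (actual !== expected) throw new Error(actual + ' !== ' + expected);
    },
  };
}

// Hypothetical injectable component under test.
const widget = {
  version: '2.1.0',
  buildEndpoint(path) { return 'https://api.example.com/' + path; },
};

// Checks like those in the suite: a renamed method or changed behavior in
// the component trips an assertion immediately.
expect(typeof widget.buildEndpoint).toBe('function');
expect(widget.buildEndpoint('events')).toBe('https://api.example.com/events');
```

Asserting on both the shape of the API and its behavior is what made the suite a reliable tripwire for breaking changes.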
- I also did a lot of work for a Fortune 10 customer over the last couple of years. Their system dealt with nuclear power plant inventory and procurement, and it used both AngularJS and Twitter Bootstrap in its construction. A typical Java enterprise stack (JBoss, Java, and Oracle) provided the service layer; however, most of the functionality was handled client-side in AngularJS. More information about the project:
- AngularJS helped as it allowed good separation of concerns to be enforced on the client side. I was able to model the data for users, plants, inventory, parts, etc., all on the client side. This information would be saved and managed there and synchronized with the service processes. It resulted in a streamlined experience for the user that was exceptionally fast, one of their core requirements. Along with that, the breakdown of services and controllers and ensuring that DOM manipulation stayed in directives allowed easy testing of the application's code.
- Twitter Bootstrap was also used to normalize a lot of presentation items in the application. Since it had to support a wide array of browsers (IE8 support included), as much as possible was used from the frameworks to ensure consistency.
- A feature requirement was mobile support so the groundwork was laid for responsive design through Bootstrap’s use of device-specific classes.
- D3 was used for a series of reports and pie-chart graphing in the core UI, showing break-out classifications for various parts.
- I was called on to help the client's internal systems team troubleshoot environments and issues unrelated to the work I had performed (the code worked well in a newly-built environment). I used my systems experience and troubleshooting skills to identify issues with application and database configurations, from the standpoint of what REST endpoints were doing in the application and how the service layer was handling those requests. Since access to infrastructure was restricted, the solution-discovery approach was a matter of reconstructing test cases and coding out a race condition to demonstrate how session data was being stored inconsistently from one instance to the next in their JBoss cluster. I also assisted with the execution of patch scripts and with debugging when different environments went out of sync with the datasets they supported.
- A startup selected me to be their interim CTO in 2014 because of how well that project went. It has been a very enjoyable engagement because I have been involved from the start to where they are now. Things have gone so well that the founder was able to quit his day job thanks to the success of the business he created and bootstrapped. The product is a widget toolset that web administrators can inject into third-party websites. It makes API calls to a centralized service to serve up survey configurations and to capture form-data submissions that get posted back. The widget has a lot of flexibility in its behavior and in how it renders for a given site. The application build-out was approached in a Minimum Viable Product (MVP) manner, allowing the main application to be rapidly prototyped and made public for paying-customer use. Many of the latest features have evolved from specific requests arising from existing customers' usage, which was the intent of the approach. More information:
- JavaScript-specific object structure for rendering the widget and providing a data model for the widget to utilize on the client side.
- Use of closures for code encapsulation.
- A basic initialization signature was used so the project could use a version of jQuery that was already available on the site. jQuery is the only dependency of the toolset.
- CORS configuration in place to allow the cross-domain API calls from the JavaScript classes.
- Flexible client-side data embedding that can be sent along with form submissions. This enables detailed data analysis of different demographic lines for specific websites.
- Implementations of different form factors of sites (desktop versus mobile).
- Use of Elasticsearch on the backend for survey storage and form post handling.
- For one client, I used websockets to connect a number of browsers together to control the presentation and answering of survey material. A controlling interface would show questions and results and use websockets to communicate state changes, which would route through the app system running on Node.js and in turn push out notifications to the other clients. Clients could register/remove themselves from the app and subscribe to a certain channel. The system also handles a client dropping off and coming back online, tracking socket ID signatures for different clients and their connection status.
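The channel bookkeeping behind that kind of system can be sketched independently of any particular socket library; the `ClientRegistry` name and its shape here are illustrative, not the production code (which sat behind a Node.js websocket server):

```javascript
// Tracks clients by socket ID, their channel subscriptions, and their
// connection status, so a client that drops and reconnects is matched
// back up with its previous subscriptions.
function ClientRegistry() {
  this.clients = {}; // socketId -> { channels: Set, online: bool }
}
ClientRegistry.prototype.register = function (socketId) {
  const existing = this.clients[socketId];
  if (existing) { existing.online = true; return existing; } // reconnect
  return (this.clients[socketId] = { channels: new Set(), online: true });
};
ClientRegistry.prototype.subscribe = function (socketId, channel) {
  this.register(socketId).channels.add(channel);
};
ClientRegistry.prototype.disconnect = function (socketId) {
  if (this.clients[socketId]) this.clients[socketId].online = false;
};
// Socket IDs that should receive a state change pushed on a channel.
ClientRegistry.prototype.audience = function (channel) {
  return Object.keys(this.clients).filter(
    (id) => this.clients[id].online && this.clients[id].channels.has(channel)
  );
};

// Usage: the controller pushes a question; only online subscribers get it.
const registry = new ClientRegistry();
registry.subscribe('sock-1', 'survey-42');
registry.subscribe('sock-2', 'survey-42');
registry.disconnect('sock-2');
console.log(registry.audience('survey-42')); // → ['sock-1']
```

Keeping this registry separate from the transport makes it easy to test without opening real sockets.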
- I did a lot of work over the last year for a venture-funded startup focused on adaptive commerce, with the goal of personalizing shopping beyond what standardly exists today. Their project was an e-commerce overlay system that augmented existing catalog data with personalized filters and recommendations for the data sets of their client websites. It handled injected JavaScript, had cross-origin resource sharing (CORS) compatibility, used NoSQL data stores (MongoDB and Elasticsearch), and used lightweight custom JavaScript objects in its newer implementation.
- Mobile development:
- I’ve used a lot of hybrid mobile development frameworks that apply JavaScript and other languages to produce non-native mobile applications. The benefit of this is that in theory you can write it once and deploy it everywhere, versus having to maintain code for each of the major platforms out there. Example hybrid frameworks and tools include PhoneGap, Appcelerator’s Titanium, Sencha Touch, Trigger.io, Corona, Apache Cordova, Dojo Mobile, and jQuery Mobile.
- I think it’s useful to cover some terminology for those new to mobile development. Native means an app is built with and for a specific mobile platform, like using Objective-C for iOS and Java for Android. Hybrid apps work across mobile platforms – on the device, not over the web – using the same code base; you could think of these as web applications that run on the device. A third approach is web apps, which are delivered to devices over a browser and designed for the mobile experience. There is overlap across these; hybrid applications that make use of native functions, for example, are very common now. Both hybrid and native applications can be distributed via the various popular app stores. The best approach to take remains debated (web search for “native versus mobile web applications”). Gartner, one of IT’s premier research firms, projected that by 2016, greater than 50% of the mobile apps created would be hybrid ones. (Source: http://www.gartner.com/newsroom/id/2324917)
- Here are some of the things that a developer should keep in mind when programming for a mobile experience:
- The stack: what software to use to create the system. Like most stack decisions, what is selected depends on what the stakeholder funding the work values, which is hopefully and usually what the users value. I don’t think there is one right or one wrong approach, rather what works for the funding organization. Common decision criteria considered in stack selection include: user experience; performance; what can be supported easily; fit to prioritized use cases; marketing (some companies like to say they have an “iPhone application,” for example, which would be technically true whether it’s a hybrid or native approach); use of native functions and visuals; desired speed of development and updates (time to market for new systems and updates); vendor/provider reputation and forecasted longevity; user technology parameters (ability and the specific devices/platforms used); system build-out intent (MVP, pilot, rewrite of a multi-million dollar user app); investment to date (training, cost of development, size of ecosystem and supporting features); flexibility; and budget for initial development and the future, taking total cost of ownership (TCO) into account.
- Speed. Given when and how mobile apps are used, I think mobile users expect more immediate interaction than users of some other systems. Personally, I will abandon a mobile application quickly if the speed isn’t there. To achieve the speed that users expect you have to be conscious of the limitations of the network the user is using, speed of the device, and how much network traffic you’re pushing through the app. Quick response on the server is also a requirement for speed. My general philosophy (you could say this applies to MVVM apps as well) is to minimize the network communication as much as possible by transmitting only the needed data and using caching tactics for commonly-accessed items. If it makes sense to push more of the processing into the client, then that is an approach that can speed up server-side responses as well. Horizontal scaling of server-side resources also helps to increase server-response speeds.
- Create content and workflows for the screen size (of course). A lot of the considerations for this are covered in my responsive design section.
- Manage and develop for user expectations. For example, is there a risk to try to mimic the native user experience if it’s a non-native application? On mobile devices, users do expect the interaction techniques to remain consistent, such as swiping, for example.
- Be aware of other constraints that you don’t normally have to be concerned about on the cloud, such as power management.
- Consider that an app may or may not have Internet access. How does it respond to that scenario? Can it work in some manner with the data it has already received or is waiting for? These questions should be answered during the discovery phase of the project.
- You typically want to ensure your application is efficient in its resource usage. For example, you don’t want it needlessly communicating on the network or using/consuming memory/CPU process continually. Proper cleanup of objects in memory helps as well.
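The caching tactic mentioned under “Speed” can be sketched with the transport injected, so the same logic works over whatever HTTP client the app uses. The `makeCachedFetcher` name, the `fetchFn` parameter, and the cache shape are all illustrative assumptions, not a specific client library:

```javascript
// Caches responses for commonly-accessed resources so repeat requests
// never touch the network; entries expire after ttlMs milliseconds.
function makeCachedFetcher(fetchFn, ttlMs) {
  const cache = {}; // url -> { value, expires }
  return function (url, now) {
    now = now || Date.now();
    const hit = cache[url];
    if (hit && hit.expires > now) return hit.value; // served from cache
    const value = fetchFn(url); // network round trip
    cache[url] = { value: value, expires: now + ttlMs };
    return value;
  };
}

// Usage with a stubbed transport that counts round trips.
let calls = 0;
const fetchStub = function (url) { calls += 1; return 'payload:' + url; };
const cachedGet = makeCachedFetcher(fetchStub, 60000);
cachedGet('/api/items');
cachedGet('/api/items'); // cache hit, no second network call
console.log(calls); // → 1
```

Injecting the fetch function also makes the caching behavior trivially testable, in the spirit of the resource-efficiency point above.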
- I’d like to add a little more context about one of the stack selection components I mentioned, flexibility. This decision criterion relates to the ease of adding new features for specific business needs while being able to reuse code efficiently. It was for this reason that one of my customers decided to go hybrid. They needed to build out customized mobile apps for each of their end customers for interaction with their services, customized manufacturing services for engineering consulting and implementation, and they wanted to do so with the same code base so they could minimize code maintenance and complexity. Flipping this, if one customer needed to write one application to serve all of its customers on one or two platforms, I think, as far as this selection component goes, there is a good case to go native.
- Example Case Studies:
- PhoneGap:
- I built a proof of concept to test the feasibility of migrating a client's existing iPhone application into PhoneGap (their primary driver was so that it could be compiled for multiple platforms). The initial target platforms for this project were iOS and Android. The application was built on the jQuery and jQuery Mobile JavaScript libraries, making use of HTML5 as well. This code drives an initial login page and a settings page that makes use of the device's local storage to allow the user to configure certain behaviors of the application.
- A PhoneGap solution was created for a tennis community site.
- A PhoneGap/Java hybrid solution was created for a beauty/salon social community and e-commerce system.
- A PhoneGap solution was made for a SaaS real estate showing management solution. This replicated the key features of the site so mobile users could perform the tasks they’d most often need while showing and working with their own and their agency’s homes.
- Apache Cordova: This is another mobile development platform that allows access to native functions on iOS or Android devices (more information: http://cordova.apache.org). It combines well with other mobile frameworks like the ones listed previously. I used this to help a global contract manufacturer specializing in electrical and mechanical devices, sourcing, and assembly create a solution they could customize for each of their end customers.
- The project is a mobile application that essentially uses a remote API to set and retrieve content in the site’s database. Every operation is managed through this API, and the app contains the logic to process and display the information received from it. It also contains the forms required to send information to the API, with all required validation. I could easily use the same HTML templates, CSS styles, and JavaScript logic to build the different platform versions (iOS and Android).
- This app is used to control a number of devices stored in the main database (accessed by an API). The user logs into the app and then they can choose a device from a list of available devices. Selecting a device displays all the associated parameters with a form, allowing the user to modify some of those parameters. The modified fields are then sent to the API to be validated and saved into the database.
- The basic elements in the mobile application are: 1) HTML to create the templates. 2) CSS to style the templates. 3) jQuery to provide extra functionality. 4) Additional JavaScript libraries as required by the application.
- All the AJAX requests are made with jQuery, usually by sending a form’s content. The request is processed by the API, and a JSON response is sent back to the app. The responses use common fields: status (either “success” or “error”), message (related to the operation, with error details), and data (when a list of items is requested, this field contains the encoded list items). JavaScript then processes the response and the content is displayed by the mobile app.
- Once the app structure is ready and the templates created, Apache Cordova is used to generate the specific version. A new directory is generated with all of the required content of a regular native application for each platform. At this point we can generate all the icons and images to use as splash screens for the app. At the final stage, the IPA (iOS) or APK (Android) files are created using specific tools for each platform.
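The status/message/data envelope described above lends itself to one shared response handler on the client. A minimal sketch, with `handleApiResponse` and the callback names as illustrative assumptions:

```javascript
// Normalizes the API's JSON envelope: { status, message, data }.
// On "success" it hands the decoded data to the caller; on "error" it
// surfaces the message so the app can display it to the user.
function handleApiResponse(json, onSuccess, onError) {
  const response = JSON.parse(json);
  if (response.status === 'success') {
    onSuccess(response.data || null);
  } else {
    onError(response.message || 'Unknown error');
  }
}

// Usage: a device-list response coming back from the API.
handleApiResponse(
  '{"status":"success","message":"","data":[{"id":1,"name":"Valve A"}]}',
  (data) => console.log('devices:', data.length),
  (msg) => console.log('error:', msg)
);
```

Centralizing envelope handling keeps the per-screen code focused on rendering rather than on transport details.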
- Digium: I created a phone JavaScript application for call monitoring. Please see the Digium section for more information.
- Nook usage in the Audience Response System (ARS) that I created:
- Part of the ARS system that I built had a requirement that it needed to work in environments where Internet access wouldn’t be available.
- To meet that requirement, parts of the application interaction were designed to run on Atom PCs that could be set up at the meeting site. A local wireless network could be established for the device exchanges.
- Nooks, a good fit for the client’s intent, were also included in the package that was sent to the meeting site so that attendees without a device could use the Nook like an Answer pad. The application they ran was a web app that was delivered by the Atom PC.
- Closing remarks about JavaScript-based mobile development:
- Programming for mobile does have its own design and development considerations, as well as different framework and development-approach options. The same is true of almost every other development domain, so I was able to enter this one with a nearly flat learning curve.
- As I know systems, back-end programming, and programming for mobile development, I’m able to create an entire system for a mobile-focused initiative, not just the mobile component. This results in greater efficiency and a more streamlined solution.
- Based on observation and reading about industry and consumer trends, this market will continue to grow. Given that, I have worked hard to stay current so I can continue to serve those who wish to bring their web-based systems, designed for monitor displays, to users who might benefit from a mobile version.
- AJAX – I keep this buzzword here on my online resume because some people still look for this, and it was important to have it listed at one time:
- I've used AJAX to create two iGoogle-like applications. One is used for an Intranet application that allows people based on their credentials to add various modules to display data and access tools (widgets) relevant to their position within the organization. Another one is used by administrators to update the content they display on their websites. By simply moving modules using JavaScript they can change the look and feel of all the websites they maintain with the administrative features.
- I use XML to support configuration settings and AJAX output on various web applications.
- I wrote extended and custom JavaScript objects to accomplish a particularly difficult set of functionality that wouldn't otherwise be handled well by traditional client/server request cycles.
- I know how to use XML and JSON for AJAX requests.
- I've built shopping cart-style systems using AJAX.
- Most applications that I’ve built typically have some sort of AJAX-based calls within them. Pagination, DataTables setups, and quickly fetching and applying filtered data are a few examples.
- Nature of programming in JavaScript versus something like PHP: There are a lot of differences in programming between JavaScript and PHP. The operational environments differ, and you have different mechanisms available to you in each language. I'll outline some of the highlights I've experienced here:
- More functional programming, not so object-focused. One of the most useful and interesting features of JavaScript is how it handles functions. This leads to a more functional approach than the typical object structure/hierarchy you might see in Java or C#. Not every feature or component has to live in an object when that doesn't make sense, and I tend to find this creates more readable, straightforward code. The catch, of course, is following SOLID design principles during development so those functions and methods provide the most utility possible.
- Event-driven model and handling work in an asynchronous fashion. I've used the event model to help build applications that respond and act more like a typical desktop application. Under the HTTP model that PHP runs under, you make a request and get a response from the server; very little interaction can take place in that mode without event listening on the client. My use of this has ranged from simple click handlers to drag-and-drop features for more streamlined workflows and interfaces, communicating results back to the server in near real time.
- Module construction. There are a number of ways to build and maintain modules in JavaScript. Most try to minimize use of the global namespace, which is generally a no-no in programming. I've built out a number of objects that use a typical JavaScript object and then attach functions to different properties to create methods. I've also used closures to create contained sets of functionality that provide a minimized interface and hide the internal workings within the containing anonymous function. I've also leveraged Backbone.js to create, manage, and model layers for use in MVC designs. I've found that using JavaScript in an MVC design really lends itself to Domain-Driven Design as well, since you're typically focusing on the application functionality and the problem domain you're solving. With this approach, you can expand out in the various directions necessary to persist and present information. Important takeaways:
- Using functions and closures for self-contained items
- App structures to avoid the global namespace
- Basic model creation, utilizing Backbone.js
- Leveraging domain-driven design for module creation
- Module extension, using behavior injection through the prototype
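The takeaways above can be sketched in a few lines; `app`, `counter`, and `Model` are illustrative names, not code from any particular project:

```javascript
// One global namespace object for the app, instead of many globals.
var app = {};

// Closure-based module: internals stay private, and only a minimal
// interface escapes the containing anonymous function.
app.counter = (function () {
  var count = 0; // hidden state
  return {
    increment: function () { count += 1; return count; },
    current: function () { return count; }
  };
})();

// Constructor + prototype: behavior is attached once and shared by all
// instances, and can be extended later (behavior injection via the
// prototype).
app.Model = function (data) { this.data = data || {}; };
app.Model.prototype.get = function (key) { return this.data[key]; };
// Extension added after the fact; existing instances pick it up too.
app.Model.prototype.has = function (key) { return key in this.data; };

app.counter.increment();
app.counter.increment();
console.log(app.counter.current()); // → 2
console.log(new app.Model({ id: 7 }).has('id')); // → true
```

The closure form hides state entirely; the prototype form trades that privacy for cheap extension and shared behavior.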
- Focus on JavaScript development, abstracting out the DOM/browser specifics as much as possible. I tend to use the same concepts for JavaScript development that I do for PHP development when defining application functionality and separation of duties (more on this below). I write browser-independent JavaScript for most of the application, using views/presentation layers for the final tie-in to the markup and the DOM manipulations that take place. This cuts down on a lot of the cross-browser checking you have to do, since the base JavaScript is pretty uniform. Libraries like jQuery help further with DOM interactions, along with AngularJS's directive approach. This also allows more flexibility for testing and for operation in headless environments like Node.js, or with tools like Zombie.js or Jasmine for testing.
- MVC-based approach towards development. Typically, I use an MVC-oriented approach for development with JavaScript-based apps. Most of my functionality is organized into modules, where I track data, state, and functionality. Views handle presentation management and there's a lightweight controller to handle interactions between the two. I've used the observer pattern for event registration between view and models as well (pub/sub pattern). This leads to a well-structured application that encapsulates a lot of the headaches that can come up with JavaScript nicely. Key takeaways:
- Utilizing a lightweight controller
- Application functionality, data, and state are all model-based
- Composite views for presentation, generally using a templating system
- Observer (pub/sub) pattern for model/view notifications
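The observer (pub/sub) wiring between models and views can be sketched without any framework; `EventHub` and `TodoModel` are illustrative names for the pattern, not code from a specific project:

```javascript
// Minimal pub/sub hub: views subscribe to model events, and the model
// publishes changes without knowing who is listening.
function EventHub() { this.handlers = {}; }
EventHub.prototype.subscribe = function (event, fn) {
  (this.handlers[event] = this.handlers[event] || []).push(fn);
};
EventHub.prototype.publish = function (event, payload) {
  (this.handlers[event] || []).forEach(function (fn) { fn(payload); });
};

// Model tracks data/state and announces changes through the hub.
function TodoModel(hub) { this.hub = hub; this.items = []; }
TodoModel.prototype.add = function (item) {
  this.items.push(item);
  this.hub.publish('todo:added', item); // the view is never touched directly
};

// The "view" here just records what it would render.
const hub = new EventHub();
const rendered = [];
hub.subscribe('todo:added', function (item) { rendered.push(item); });

const model = new TodoModel(hub);
model.add('write tests');
console.log(rendered); // → ['write tests']
```

Because model and view only share the hub, either side can be swapped out or unit-tested in isolation.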
- Data exchange with JSON, XML, etc. It’s critical to be able to store and render views on the client so the server only needs to send pure data in some format. I've utilized a number of mechanisms for data exchange in JavaScript. Most common are JSON and XML exchanges, but I've also worked with JSONP and CORS, which help get around the same origin policy that you face with typical AJAX calls. I've also used the local storage mechanisms in HTML5 for persisting local data that is expensive to transmit, but doesn't change often to help make applications quicker. For some systems, I’ve used long polling with custom-built RPC functions.
- Protected Object Management and Execution: One of the important reasons I've used server-side frameworks is to keep sensitive object management and execution out of the browser. A downside of pure client-side JavaScript is that it is all text sent to the client: anyone can review the code to see what the application is doing and how it works, which can reveal trade secrets and possibly security holes. Client-side validation should also never be relied on alone. A benefit of Node.js is that server-side validation can use the same language as the client-side validation, so you only have to maintain one version of the library instead of two in different languages. This helps satisfy the "don't repeat yourself" (DRY) principle of software development.
- Hashbang routing. Using JavaScript to drive navigation and rendering via "hashbang" URLs is a popular technique. I've leveraged hashbang routing for URL management on the client side, allowing for direct page access, bookmarking, and navigation management, so the back and forward buttons on the browser work as expected. This mechanism also lets me follow the Google convention for transforming such a URL into a direct server call with a query string that spiders can use to properly index page content. It is not typically appropriate to apply it to an entire website; among development circles, one of the most infamous examples of doing so is the Gawker redesign debacle (a web search for "gawker hashbang" turns up many articles about it). A fairly non-technical explanation of why hashbangs can be a bad idea: http://www.tbray.org/ongoing/When/201x/2011/02/09/Hash-Blecch. A more technical one: http://isolani.co.uk/blog/javascript/BreakingTheWebWithHashBangs.
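The Google convention mentioned above (the AJAX crawling scheme, which Google has since deprecated) maps a `#!` fragment to an `_escaped_fragment_` query parameter that the server can answer with indexable content. A sketch of that mapping; the function name is illustrative:

```javascript
// Maps http://site/#!/photos/1 to http://site/?_escaped_fragment_=%2Fphotos%2F1
// so a crawler can request a server-rendered, indexable version of the page.
function toEscapedFragmentUrl(url) {
  const i = url.indexOf('#!');
  if (i === -1) return url; // no hashbang, nothing to transform
  const base = url.slice(0, i);
  const fragment = url.slice(i + 2);
  const sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}

console.log(toEscapedFragmentUrl('http://example.com/#!/photos/1'));
// → http://example.com/?_escaped_fragment_=%2Fphotos%2F1
```

The server-side route then decodes `_escaped_fragment_` and renders the same view the client-side router would have built.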
- Other JavaScript Notes:
- Beanstalkd (work queue) for Node.js:
- Incorporated a queue class structure around Beanstalkd (http://kr.github.io/beanstalkd) to monitor and track jobs and to handle failures in a job process so that a failed job can be inserted back into the queue. It also touches Beanstalkd during long-running tasks so the job doesn't get stopped by a timeout.
- Leveraged this to separate long-running or expensive operations from the synchronous processing of UI interaction. This provides more direct feedback and more scalability with processing job-related items.
- Template Libraries: I’ve used both Mustache (logic-less templates) and Handlebars for View construction in a few of the JavaScript applications I’ve built.
- Digium Phone App Development:
- Created an application designed to run on the Digium phone V8 platform. This VoIP app watches for outgoing calls, uses a long polling system (since the platform doesn't support sockets) to keep a constant connection to a centralized service (its registration process), listens for alerts, sends alerts based on pattern matching (not feature-complete yet), and executes a visual display on the phone when an alert is received.
- Handled call monitoring from a local phone standpoint rather than intercepting the call with rules on the phone server. This would match against a known set of numbers (like 911 for example) and when matched executes a request to a shared service that then picks up that notification and follows a series of rules regarding how to alert other phones in the same notification account.
- Created Jasmine tests to verify the functionality of the Digium phone app service, executed with the jasmine-node module installed from NPM (Node.js). I also used Frisby (a REST API testing framework built on Node.js and Jasmine) to test the API endpoints available in the Node.js process.
- This could be used in a school setting. For example, if a teacher dials 911, the office can get a notification that it occurred in real time.
- Prototyped a call look-up system based on an incoming call’s phone number. It would make a request to a REST service on the customer’s CRM, and fetch specific account information regarding the caller that can be reviewed right on the phone.
- Education:
- I have a first or second edition Rhino O’Reilly book around the house somewhere.
- I mostly follow software development and architecture sites that aren't specific to a language. However, I do follow http://www.reddit.com/r/javascript and frequent framework-specific sites when working through a particular problem or build-out.
- Closing Remarks: As I hope the length of this section indicates, I am investing heavily in JavaScript because the growth of its use is apparent just from observing the market. Moreover, it's more and more enjoyable to use because of its power and its cross-platform and cross-browser applications. For you O'Reilly readers, the Rhino is charging, and I'm not sure when it's going to stop.
- Responsive Design:
- I’ve been doing responsive design – creating software that adapts well to different views, such as phones, tablets, phablets, and monitor sizes – since before it had such a name.
- Responsive Views vs. Responsive Design – Making a Choice: There are two different approaches to implementing responsiveness on a site. The first one is to make a new site from the ground up to include responsive behavior (responsive design) and the second one is to implement responsive behavior on an existing site (responsive views).
- Responsive Design: Rewriting the page templates using a CSS framework that includes a grid-based layout system will provide the highest-quality responsive layout. Many good frameworks provide this functionality; the one my clients ask for most is Bootstrap, one of the most popular, which provides elegant solutions for several known and common issues in implementing a responsive design. Sometimes the more complex task is adapting the site’s HTML templates or other existing code to the point where the CSS rules that produce the right presentation for the viewing area can be applied. In other words, implementing responsive design can be time-consuming if you have to change the entire structure of the site. In more detail: grid-based frameworks use a series of divs and a styling approach that gives each grid cell a particular percentage of the horizontal real estate. This is similar to tables, but the HTML is designed to be flexible and to respond to the CSS applied to it. In many cases, table-based layouts are forced into a particular shape by the nature of the elements used in the markup, making the code very inflexible, and it would take quite a bit of time to update such code to enable responsive design.
- Responsive Views: It is, of course, possible to add responsive behavior without rebuilding on a framework’s grid system. The primary reason to consider this is reusing a site’s existing HTML markup without rewriting it into the structure a framework expects. This works best when the current website already has a CSS front-end framework and some support for responsive behavior. It’s quite difficult if the current markup contains many hard-coded widths, uses tables to control structural layout, or embeds CSS styles within the markup; adding responsive behavior through CSS media queries alone would then be very difficult, if not practically impossible (that is, extremely time-consuming and cost prohibitive). In that case, any responsive approach for the site will require rewriting substantial portions of both the HTML markup and CSS.
- What to Do?:
- If you cannot change your site significantly, then the responsive view approach is the way to go.
- If you can or it’s a new site entirely, then the responsive design approach is preferred.
- If the current site isn’t built to support responsive behavior, there isn’t an easy winner, as either approach will require a lot of work. When the decision is a difficult one, I suggest assessing your drivers for the project so I can help you determine what I’d do if I were your CTO (the perspective I like to take when offering advice). Adding to the difficulty, it’s not possible to estimate this other than in broad strokes unless it’s a very simple site, because every line of code would have to be assessed to come up with an accurate number. (Unless that is done, any estimate you receive is a guess, and that is risky for both parties.) The safest way to proceed, if we were to work together, would be to do a few pages, assess the costs, and move forward iteratively. Via this approach we can develop a sense of how much effort a small percentage of the work takes, gauge whether the work is feasible without extreme effort, and determine the timing and cost for that unit of work. We can then apply that across the rest of the page types/templates to forecast the time commitment the rest of the site would require under either the responsive design or responsive views approach.
- Horizontal Screen Resolution vs. Type of Device:
- Good responsive behavior should be designed around the horizontal screen resolution of the device viewing the page, not around the type of device. For example, some phones have display resolutions greater than that of many desktops, so it doesn’t make sense to deliver the “phone version” of the site to those devices.
- A grid-based responsive layout will often define several “break-points” at different horizontal resolutions where the layout of the page shifts. This shift may involve allocating different percentages of the screen width to different columns of the design, or it may involve wrapping some elements of the layout down to new rows.
- Mobile devices can usually be rotated, which changes the horizontal resolution of the screen. A good responsive design should support this and display a wider layout if the new horizontal resolution of the screen is wide enough. When testing a responsive design, it is important to test mobile devices in both landscape and portrait orientation for this reason.
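The resolution-driven (rather than device-driven) rule above boils down to a breakpoint lookup. A sketch, using Bootstrap-era breakpoint widths as illustrative values:

```javascript
// Picks a layout tier from horizontal resolution alone; rotating a
// device simply feeds a new width through the same function.
const BREAKPOINTS = [
  { minWidth: 1200, layout: 'large-desktop' },
  { minWidth: 992,  layout: 'desktop' },
  { minWidth: 768,  layout: 'tablet' },
  { minWidth: 0,    layout: 'phone' }
];

function layoutFor(horizontalPx) {
  // BREAKPOINTS is ordered widest-first, so the first match wins.
  return BREAKPOINTS.find((bp) => horizontalPx >= bp.minWidth).layout;
}

console.log(layoutFor(1080)); // a phone in landscape → 'desktop'
console.log(layoutFor(414));  // the same phone in portrait → 'phone'
```

Note the same physical device lands in different tiers depending on orientation, which is exactly why testing both landscape and portrait matters.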
- Additional Features:
- There are additional features that may go into building a mobile-based site that are not necessarily part of a responsive design. For example:
- It may be beneficial to serve users lower-quality versions of images, given that mobile connections are often slower and frequently metered with data transfer limits.
- Mobile users interact via a touchscreen, while desktop users interact via a mouse. This means mobile-friendly designs must avoid making any navigation dependent upon “hovering” the mouse above something. It also means mobile users may expect certain “special” navigation interactions to work – such as swiping side to side on the screen to scroll through content.
- Part of the planning process may involve determining which features are desired and where they should be implemented on the site.
- Tools:
- The ability to use CSS grid systems, use CSS media queries, use CSS class hierarchies, understand breakpoints, apply OOCSS, and create content that’s adaptable, such as flexible images, are very important for designing and developing responsively.
- As with JavaScript and its frameworks/toolkits/libraries, great tools like Sass (Syntactically Awesome Style Sheets), Compass (CSS Authoring Framework), 960 Grid, Golden Grid (GGS), and others are enabling developers to create responsive sites much more efficiently and effectively.
- Final Commentary:
- Given the importance of mobile computing (usage and volume of transactions), and with that trend continuing, responsive design is essential if one’s audience expects to use a system across different devices.
- I think responsive design is one indicator that developers and those who work with developers are becoming more aware of the benefits of focusing on user experience at the device level. It means they need to consciously define what is most important for the application, whether it be a feature or piece of content, as the resizing does shuffle features and content according to the limits defined by the screen size. I think it’s also interesting in that it really focuses the stakeholders on how to prioritize various components of the page; for example, navigation elements that are fairly standard on systems might be trumped by the most popular feature or key content. Lastly, looking at it from the opposite perspective, it helps stakeholders identify what’s truly not important, which means some content that could have made it to production in the past now does not, which I think also improves the user experience in most cases.
- Whether for responsive design or not, standard development concepts, such as separation of code and design, are always important. Thanks to responsive design, I think this is getting more attention.
- It’s rare that I build a site that isn’t responsive or that can’t easily support responsive design in the future.
- XML
- At Up and Running, XML supports configuration settings and AJAX output on various web applications.
- At PCI, XML served as a storage mechanism for quality data. XSLT allowed for various presentations of that data on the Intranet.
- I designed and deployed SOAP interfaces.
- I used DTDs to generate custom APIs.
- I can syndicate any content for use in mashups, portal pages, RSS readers, and more.
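Syndicating content as a feed, as mentioned above, mostly amounts to serializing records into the expected XML shape. Below is a minimal, illustrative RSS 2.0 builder; the tag names are standard RSS, but the item shape and the simple escaping here are a sketch, not production code.

```javascript
// Escape the five XML special characters so titles and links are safe to embed.
function escapeXml(s) {
  return String(s).replace(/[<>&'"]/g, c =>
    ({ '<': '&lt;', '>': '&gt;', '&': '&amp;', "'": '&apos;', '"': '&quot;' }[c]));
}

// Build a minimal RSS 2.0 document from a channel title, link, and item list.
function buildRssFeed(title, link, items) {
  const itemXml = items.map(i =>
    `<item><title>${escapeXml(i.title)}</title><link>${escapeXml(i.link)}</link></item>`
  ).join('');
  return `<?xml version="1.0" encoding="UTF-8"?>` +
    `<rss version="2.0"><channel><title>${escapeXml(title)}</title>` +
    `<link>${escapeXml(link)}</link>${itemXml}</channel></rss>`;
}
```

The same serialized output can then feed mashups, portal pages, or any RSS reader.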
- XHTML/CSS
- I have been using these since they were created, and I prefer to hand-code my XHTML and CSS in a plain text editor, as it produces cleaner code.
- My work is compatible with the major browsers, whether or not they support W3C compliance.
- I cover my experience with responsive design above. In that section, I make note of the following in particular:
- Outside of JavaScript, the ability to use CSS grid systems, use CSS media queries, use CSS class hierarchies, understand breakpoints, apply OOCSS, and create content that’s adaptable, such as flexible images, are very important for designing and developing responsively.
- As with JavaScript and its frameworks/toolkits/libraries, great tools like Sass (Syntactically Awesome Style Sheets), Compass (CSS Authoring Framework), 960 Grid, Golden Grid (GGS), and others are enabling developers to create responsive sites much more efficiently and effectively.
- I am familiar with semantic, SEO-friendly coding and markup, and I understand the importance of writing my code in this manner.
- I generally design and write XHTML or DIV-style layouts for sites, utilizing CSS for presentation. I can work with the older table layout method as well.
- I have worked with templating engines before, and wrote one for the UAR ABLE framework as well, which integrates into a custom CMS system. I can use any templating system you wish to if you prefer to use one.
- Graphics designers often have me take their works of art and convert them into web-ready CSS and XHTML.
- I am confident I can use XHTML and CSS to do anything you need.
- Please see my Experience in using and interpreting web standards (browsers, accessibility, and validity) section for more relevant experience in this area.
- Amazon Web Services (AWS) and Management
AWS Certificate
- As it is one of my core competencies, I am able to help with any Amazon Web Services need you may have. The following are some examples:
- Amazon Simple Storage Service (S3):
- I have implemented S3 for all of the internal systems for Up and Running; development, testing, and production servers; as well as for many of my clients. One client is one of the largest SEO internet marketing firms in the market.
- A wrapper script was written to provide an abstraction layer for the rest of the client code. In this manner, if the API signature changes on S3's side, then I just have to change the internal implementation of the wrapper. Also, additional business logic in how S3 is utilized can be contained in the wrapper or injected at runtime to help govern operations.
- Up and Running and many of our clients have moved to S3 to reduce bandwidth consumption/needs on our servers, along with the administrative oversight and costs that need to occur for internal servers.
- Another benefit for many clients is that the S3 service serves up files quickly, usually much faster than internal servers. I find this particularly useful when a feed system is hosted on other websites. The images that make up the feed graphics can be put on S3 to save bandwidth and reduce the cost of transmission. These feeds could be viewed tens of thousands of times per hour, which would put a considerable load on the clients' dedicated hardware. Thus, in this example, S3 really saves money and results in a better user experience.
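The wrapper approach described above can be sketched as a small abstraction layer. This is an illustrative sketch, not the actual wrapper: the class and method names are hypothetical, and the in-memory backend stands in for a real S3 client injected at runtime.

```javascript
// Hypothetical storage wrapper: client code depends only on this class, so a
// change to the S3 API signature is absorbed inside the wrapper, and business
// rules governing storage use (here, key prefixing) live in one place.
class RemoteStorage {
  constructor(backend) { this.backend = backend; } // backend injected at runtime
  put(key, data) { return this.backend.put(`uploads/${key}`, data); }
  get(key)       { return this.backend.get(`uploads/${key}`); }
}

// A stub backend standing in for the real S3 client during development/testing.
function makeMemoryBackend() {
  const store = new Map();
  return {
    put: (k, v) => { store.set(k, v); return k; },
    get: (k) => store.get(k),
  };
}
```

Swapping the stub for a real S3 client changes nothing in the calling code, which is the point of the abstraction.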
- Amazon Elastic Compute Cloud (EC2):
- I have created machine images to manage Up and Running's critical services and developer resources.
- I wrote a Perl script, utilizing the instance control API, to dynamically mount EBS volumes and attach Elastic IPs to an instance during startup using the user data scripting system. This script allows for a machine instance to be configured in such a way that configuration and user data is persisted between reboots. The mounting overlays critical areas of the operating system so that configuration changes and user data are stored on the EBS volume, making the running instance work more like a dedicated server.
- I utilized the Elastic IP system to assign a public IP to the instance, and to configure a round-robin load-balanced system when load dictates it.
- I use EBS snapshots to handle backups of EBS data, along with an rsync-based offsite backup system to protect all server data.
- Amazon SimpleDB:
- I have explored and tested this.
- I have presented this to clients, and they have not chosen to go with this solution at this time.
- Amazon Mechanical Turk
- I explored the implementation of this for a customer who needed some manual assistance in a workflow with some specifically-defined tasks.
- The API makes it easy to integrate these features into a system such that human-provided work can be submitted into the principal application.
- Other Amazon Web Services: with the experience I have using Amazon's Web Services, I am confident in my ability to quickly use any of their other or future services.
- For general Linux and Systems Administration experience, please see my Linux/Apache & Systems Administration section.
- Perl
- For an Up and Running customer, the largest amateur hockey league in the world, I wrote a team management system that ran a 900+ team hockey league that had over 10,000 user accounts.
- I currently use Perl for maintenance tasks and one-liners in command line chained commands.
- C#
- I've used C# to write an online backup tool.
- I also used C# to write a client application for easy time entry that works with my web-based framework.
- Interfaces
- I know the methods with which applications can talk, and the abstract pros/cons of each approach. In my career I've worked with numerous APIs, interfaces, and code to make different tools work together well.
- I can configure any SOAP or REST interface needed to make a system work with other systems, or design and deploy a specific, custom API for interaction with a third-party application.
- An interface contract is an incredibly useful tool to define when building an integration between two systems. The interface contract defines the communication across the two systems’ boundaries, including how messages are initiated, what they contain, and how the receiving system responds. One of the main benefits of this approach, especially if both systems are being developed simultaneously, is that each team can build its side of the system boundary in parallel. As long as they adhere to the interface contract, the message exchange will work as intended when it comes time for the systems to interact. If the interface contract needs to change, then both parties need to contribute to defining the new approach.
- In my current projects, I've been primarily using REST interfaces for system-boundary exchange because it's simple, scalable, reliable, and effective. Below are some areas that I cover or pay attention to when it comes to implementing a REST interface:
- I try to use the data model as a guide for how the REST API should be set up. I analyze the various entities, and those become the core REST object entities. Looking at associations and with some understanding of the business rules, it's a straightforward process to define how the entities relate to each other and if particular endpoints should implement specific business rules around how the data is handled.
- There are some specific guidelines regarding how a REST interface should work. It uses HTTP verbs to provide guidance on how creates, reads, deletes, and updates work against an entity. Ideally, REST is used primarily for data exchange. On some projects, I’ve helped refine how the REST layer was being used, as it was taking more of the shape of a remote-procedure-call-style interface (in a few instances, that made complete sense and we therefore made the change to XMLRPC).
- I’ll typically define a deeper URL structure to handle and store associations.
- If possible, I'll use plural and singular URL reference points for querying multiple or single records.
- In an ideal situation, I'll normally use a JSON-based structure for POST data in POST and PUT calls.
- I think it's important to return JSON responses, along with header codes.
- I'll typically handle authentication details in the header of calls after a session is established. Whether this is done or not depends on the security layer implementation.
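The REST conventions above — entities from the data model become endpoints, HTTP verbs map to CRUD, deeper URLs hold associations, and responses pair a status code with a JSON body — can be sketched as a tiny route handler. Everything here is illustrative: the `orders`/`items` entities and the route table are hypothetical examples, not any real project's API.

```javascript
// Hypothetical data model: an order entity with an associated list of items.
const db = { orders: { 1: { id: 1, status: 'open', items: [101] } } };

// Resolve a method + path against the REST conventions described above,
// returning a status code plus a JSON-serializable body.
function handle(method, path) {
  const parts = path.split('/').filter(Boolean); // e.g. ['orders', '1', 'items']
  if (parts[0] === 'orders') {
    const order = db.orders[parts[1]];
    if (method === 'GET' && parts.length === 2) {
      // Singular reference point: one record, or 404 with an error body.
      return order ? { status: 200, body: order }
                   : { status: 404, body: { error: 'not found' } };
    }
    if (method === 'GET' && order && parts[2] === 'items') {
      // Deeper URL structure (/orders/1/items) exposing an association.
      return { status: 200, body: order.items };
    }
  }
  return { status: 405, body: { error: 'unsupported' } };
}
```

A real implementation would hang these conventions off a router and add POST/PUT/DELETE handlers plus header-based authentication, but the status-plus-JSON shape stays the same.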
- For social networking purposes, I've done some interesting work:
- Twitter: I've created a Drupal block for a social media portal site that uses the Twitter API to display a list of the most recent tweets on a given Twitter account. The block can be configured via the Drupal Admin Panel, and allows the website owner to enter their Twitter account details and configure several parameters regarding the display of the tweets. Additionally, I have created a Drupal module that hooks into Drupal's existing blog system, allowing the user to post an announcement to Twitter automatically when they create a new blog post. The module adds a new checkbox option to the page for creating blog entries, and, if checked, will use Twitter's API to publish a new tweet for that blog post.
- Facebook: The social media portal site is integrated with Facebook Connect for user authentication and comment posting purposes. For user authentication, this allows a visitor to log in to the site using their Facebook username and password, and does not require them to register a separate account on the social media portal. Existing Drupal users can also authenticate via Facebook Connect to take advantage of the comment posting integration. When a user posts a new comment, they have the option of automatically publishing information about that comment back to their Facebook account. Additionally, I have integrated Facebook with Drupal's blog system in the same way that Twitter is integrated with it. When a new blog post is created, the user has the option of automatically publishing information about that post back to their Facebook account.
- MySpace: A MySpace option can be presented on Drupal's blog posting form as well, giving the user the option of automatically publishing new blog posts to both of those services.
- Reddit: I have explored integration methods for this service.
- Digg: I have explored integration methods for this service.
- RSS Feeds: In addition to sending information out to social sites, the social media portal can also receive information from any site in the form of an RSS feed. The Blog owner has the ability to monitor RSS feeds from within the Drupal admin panel, and can choose to post any story from the feed into their own blog (with additional comments of their own on the story contents).
- I have explored Yahoo! Query Language (YQL): "The Yahoo! Query Language is an expressive SQL-like language that lets you query, filter, and join data across Web services." (Source: http://developer.yahoo.com/yql)
- I have used Authorize.net's Advanced Integration Method (AIM) to allow a proprietary e-commerce site to accept credit cards using Authorize.net, while still retaining their existing order-handling business logic. Credit card information was collected from the user on the Merchant's site, and securely transmitted to Authorize.net using cURL. The site also took advantage of Authorize.net's ability to store extended order information by transmitting order details and complete customer billing and shipping information with the credit card information. The response from Authorize.net was then processed by the script, and the proper code was executed depending on whether the charge was successful or not.
- I have customers who have me attend API design meetings to act as a consultant for them and for the third-party team of programmers they meet with for such discussions.
- Examples of interfaces I've worked with:
- Google APIs
- Twitter API
- Facebook Connect
- MySpace API
- Authorize.net
- Paypal's interfaces
- Various shopping carts
- VeriSign's Payflow Pro API
- Intuit's QuickBooks Interface
- UPS WorldShip
- USPS APIs
- Another approach I’m a proponent of when integrating a third-party system (either a published API or when working with another developer or team to integrate two or more systems) is to define an interface contract between the systems. This is generally a simple system boundary diagram that documents every exchange: what the outgoing calls are and what results are expected from each. With a published API, this is generally pretty simple since the API already has that defined; I just implement the necessary signatures in my design to accomplish the exchange of information. When I’m designing or working with others to define an API or how two systems will exchange data, it becomes more of an exercise in working together to define what the exchange will look like, particularly how endpoints will be consumed and how they will react to being used. Going through this process can prevent a lot of problems and save a lot of time downstream, and it's a good investment for creating a system that is easier to maintain.
- I have additional information for some PHP-specific points about interfaces in this resume.
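An interface contract of the kind described above can itself be expressed as a small shared data structure both teams code against. The sketch below is purely illustrative; the `createUser` exchange and its field names are hypothetical.

```javascript
// Hypothetical interface contract for one exchange: both teams build against
// this shared description of the message shapes, then develop in parallel.
const createUserContract = {
  request:  { required: ['email', 'name'] },  // fields the caller must send
  response: { required: ['id', 'email'] },    // fields the receiver must return
};

// Check a message against one side of the contract before sending or
// accepting it; a shared check like this catches drift from the contract early.
function satisfies(message, side) {
  return side.required.every(field => field in message);
}
```

Either team can run the same check in its tests, so a change to the contract surfaces immediately on both sides of the boundary.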
- CMS
- I've used several open source CMS packages and frameworks, and have extended them using custom code. Examples: Drupal, Joomla!, WordPress, and ExpressionEngine.
- When I first started using PHP in 2001, I developed my own CMS within a framework that supports many features beyond strictly CMS features. The CMS I developed handled dynamic page and navigation creation, along with version-controlled content. I could create inline editing features as needed. I no longer recommend this to clients because of the rich CMS solutions backed by strong development communities that exist today.
- I've done CMS conversions, both from static sites and CMS-driven sites.
- I've helped companies define custom CMS operational oversight processes for their enterprises.
- Drupal: PHP Content Management System and Framework - Overview of Experience:
- I can implement any theme, implement any module, design custom modules, extend existing themes and modules, perform Drupal conversions, and lead any other Drupal need you might have.
- I've done conversions to Drupal 6 from Drupal 5 and from non-Drupal sites.
- I've handled lead development duties for a production site with 8,000+ registered users, and another site that had 20,000+ page views per month.
- One week in 2009, I had three companies independently ask me to finish what other development teams had started. I understand Drupal, but my core competencies, including a deep understanding of PHP and SQL, are what allowed me to contribute to these projects. I helped my customers achieve their goals with Drupal because I know the foundational aspects of programming, specifically PHP and SQL, at the expert level.
- Here are some of things I do and have done with Drupal:
- Maintain databases from development to staging to production using a custom script written in XML and PHP.
- Implemented internationalization, including Japanese and Spanish localizations.
- Integrated the Yahoo! Grids CSS system to work as a Drupal theme, fully-customizable within the constraints of the grid system. This reduced the time and effort required to develop themes.
- Prepared ad campaigns for monetizing site usage, and implemented them using JavaScript functionality.
- Rather than implementing Drupal on a standard setup for a high-traffic site, I decided to use a SAN. A SAN allows remote storage devices to be connected to a server and appear as locally-attached devices. This improved performance of the site, and allowed for replication of data.
- Themes; here is some of the work I've done with themes:
- I can implement new themes based off of new designs to the pixel-perfect level. I know this because I work with customers who are professional designers with excellent eyes for perfection in design.
- I have created custom themes for Drupal websites using template files, CSS, HTML, and JavaScript. My approach enables almost every view within a site to be customizable.
- Standard Implementations:
- I implemented user-oriented functionality based on Drupal's core functionality. Examples include: User Roles, Remember Me, Tracker Module, and more. Remember Me allows users to be remembered every time they log in to a site. The Tracker module allows tracking of a user's posts and actions within the site.
- CCK (Content Creation Kit) is a core piece of Drupal functionality I commonly use to allow administrators to define custom node types. I use CCK to give users not experienced in writing PHP/MySQL an interface for adding content to their sites.
- I implemented the FlashVideo module, which allows for uploading video to nodes in Drupal. The video is converted to .swf, and has built-in interfaces for the Dash Media Player and server support for hosting videos using Amazon's S3 servers.
- I have used the blocks interface to enable an administrator to define blocks of content to be shown in specific areas of pages within a site.
- I have written over 50 custom modules for Drupal; specifically, I've contributed to creating these:
- Actionable Events Module - creates a reporting system similar to what is implemented in Facebook that updates the owner of content when actions are taken on that content. For example, if a user posts a blog and a comment is posted on that blog, the owner will be alerted on their profile page.
- Anonymous Posting Module - uses Drupal hooks and allows authenticated users to post content anonymously. Records user id for the administrator to view, but for other users the content will appear to be posted by anonymous.
- Ask Module - extends the forum module to allow users to ask questions of a specific group. Uses the taxonomy categories as the forum containers.
- Comment Alter Module - extends Drupal's default commenting scheme to alter just about anything needed. It adds the nofollow attribute to all comments posted by untrusted users. It also spell-checks posts by users.
- Comment Rating Module - extends the votingapi module to allow voting on comments before this functionality was offered in the fivestar module.
- Community Site Content Creation Module - allows users to easily add products, websites, save favorite websites, ask group questions, ask individual questions, add a book to the site, and more.
- Limit Tags Module - hooks into the taxonomy functions, and limits the freetags a user is allowed to enter based on a predefined set of taxonomy tags. Has an autocomplete function built-in for users, and also suggests possible terms.
- Location Privacy Module - alters the behavior of the location module by storing user input in the database. A user can set their location to be viewable by no one, by other users marked as friends (extends the buddylist module, a contributed Drupal module), or viewable as a public listing.
- News Feed Module - mimics Facebook's news feed function in that it reports to users what their friends are doing on the site. Reports comments, posts, questions, and can be extended to update on other programmed actions. It can be digested as an RSS feed.
- Node Book Module - custom node type module that allows for users to enter a book that they like. Allows a user to enter a title, price, ISBN, and scrapes an image from Amazon.com to use as a descriptive image of the node.
- Node Generic Module - custom node type module that supports users' input of websites they prefer. Allows a user to enter a title, URL, and description of website, and attempts to scrape an applicable image from the site to use as the node's image.
- Node Privacy Module - this module allows a node to be marked as private, public, or viewable only by friends of the owner (extends the buddylist module, a contributed Drupal module).
- Node Question Module - custom node type module that mimics Yahoo! Answers' functionality. Questions are asked by users and are then answered by experts on the topic. Other users of the site can comment on or rate (using the fivestar module, a contributed Drupal module) the answer and question.
- Privatemsg Connections Module - extends the privatemsg module to make it interact with the buddylist module. This functionality allows a user to send private messages to other users who they have added as friends.
- Profiles Module - extends Drupal's default profile functionality by creating sub-navigation for the profile pages, adding extra functionality for message boards (using the guest book module), and creating customized searches that allow users to find others with similar interests.
- Questionrouter Module - extends the Node Question node type in that it sorts through the available experts on a site, and then tries to route a question to the proper expert. It performs the routing by comparing the taxonomy tags a question has with the self-defined expertise associated with the users.
- Questions and Answers Module - a module that mimics Yahoo! Answers. A user can ask a question that remains unpublished until an expert answers the question. Once answered, both are published to the community.
- SOAP Digest & Search Module - the module, connected to a web services API, uses SOAP to download the results, and display them in the Drupal site. AJAX was used to retrieve data from the server asynchronously.
- Taxonomy Module - extends the taxonomy term functionality to create groups based on terms. It provides a means to tag users and content, and allows users to add a taxonomy tag to their profile and view all content tagged with this term as part of a news feed grouping. The news feed was a custom module I developed. Its features were similar to a Facebook news feed, where a user who is part of a group can see all actions that occurred within that group. I also implemented tag clouds using the taxonomy module.
- Taxonomy Wikipedia Module - custom module that grabs definitions for taxonomy terms from Wikipedia. The definitions as well as pictures are then displayed within the homepage for that taxonomy term.
- User Tracking Module - a custom module that enables one user to track the activities of another user within a site. As an example, this permits a user to easily view all the content another user has created, and subscribe to an RSS feed of that activity.
- Various Utilities Module - handles everything from homeless hook_form_alter functions to styling search pages, profiles, forms, and blocks.
- Violater Actions Module - extends the flagged content module to allow an admin to block a user, delete content, unpublish comments, and send warning emails to troublesome users.
- Wikipedia Definition Module - attempts to add a definition based on Wikipedia content for each taxonomy category added on the site. Scrapes the content from Wikipedia, parses it, and displays it.
- In addition, I have used, modified, and/or updated the following contributed or core Drupal modules in a heavy-traffic production environment:
- Aggregator - publishes syndicated content using RSS.
- Amazontools - allows a user to input an ISBN or book title, and subsequently fetches info for that book from Amazon.com.
- Block - builds support for the block that is available in the theme. Allows an admin to add content to different areas of the page (blocks) whenever they desire. Can add new content by administering the blocks.
- Blog - builds the blog node type.
- Blogapi - extends the blog node type to allow for customizations.
- Buddylist - allows users to create lists of friends.
- Captcha - builds support for adding captchas to forms.
- Color - allows the administrator to change the color palette of a theme through the admin panel.
- Comment - builds support for the commenting functionality.
- Contact - builds functionality for creating a contact form to allow site visitors to contact the administrators.
- Devel - assists in developing, clearing caches, and updating content.
- Favorite_nodes - allows users to designate favorite items on Drupal sites.
- Fckeditor - adds support for the fckeditor, a popular WYSIWYG editor.
- Filter - allows for filtering of content. Can write custom filters to parse content, remove harmful text (code), and remove offensive words.
- Fivestar - extends the voting module to allow for a fivestar voting widget using AJAX.
- Flag_content - allows users to flag offensive content for future perusal by an administrator.
- Forum - creates the forum node type.
- Guestbook - builds support for a Facebook-style 'wall'.
- HOF - a hall of fame module that builds publicly-viewable statistics to give credit to users who cause a site to succeed.
- Invite - builds a form that allows a user to invite their friends via email.
- Karma - allows for a comment rating system based on a user's past comments. If a user has a good rating on previous comments, they are considered a trusted user.
- Help - creates help topics and text.
- Legacy - remaps deprecated style URLs to URLs usable by Drupal.
- Locale - allows for multi-language support.
- Location - allows users to input their location at the zip code or full-address levels.
- Logintoboggan - allows the administrator to set the destination page after a user logs in.
- Menu - builds support for the menu functions in Drupal.
- Node - creates support for nodes: adding, editing, moderating, deleting, searching, etc.
- Nodewords - a toned-down version of the taxonomy module that allows for meta tags on nodes.
- Path - builds support for creating readable URLs for pages normally labeled like /taxonomy/term/12/15.
- Ping - used to notify third-party sites of changes on one's site.
- Profile - builds support for the profile functions supplied by Drupal.
- Pathauto - automatically generates path aliases for nodes.
- Poormanscron - runs cron jobs on systems that don't have access to the cron application.
- Privatemsg - allows users to send private messages to each other.
- Profile_privacy - allows for users to set privacy on their profiles.
- Remember_me - uses cookies to remember users.
- Search - builds support for the Drupal search engine. Currently uses MySQL, but can use Solr.
- Statistics - creates reports on default Drupal values, and can be modified to add other site-specific stats.
- System - helps build the infrastructure Drupal is based on.
- Tagadelic - creates tag clouds with taxonomy terms.
- Taxonomy_super_select - builds a multi-select form for taxonomy terms.
- Taxonomy_xml - allows input of taxonomy terms using XML. Allows administrators to input a large number of terms.
- Taxonomy - builds tagging features.
- Throttle - allows specific modules to be throttled based on Admin decisions. If a certain module takes too much bandwidth, then this module can be used to only allow it a certain amount of processing power/bandwidth.
- Token - allows for small bits of text to be entered into documents using placeholders like "%content", as well as allowing for other modules to create their own custom tokens.
- Tracker - allows for tracking of users based on what actions they have done on the site. It can track content posts, track comments, and can be modified to track other types of actions.
- Upload - builds support for uploading files.
- User - creates support for the user object functions.
- Userpoints - a points-style module that doles out points based on rules provided by the administrator.
- Votingapi - builds the backend voting support that is extended by other modules like the fivestar module.
- Watchdog - handles the logging of errors and other declared actions to track on the site.
- Workflow_ng - allows customization of Drupal's built-in workflows.
- Xmlsitemap - creates a sitemap that conforms to the sitemaps.org specification.
- ExpressionEngine: PHP, commercial Content Management System
- ExpressionEngine was created by the same company that created CodeIgniter, a PHP MVC framework I use heavily, so I am quite comfortable using it.
- I can extend ExpressionEngine using its Extensions, Plugins and Modules.
- Because it's a commercial product, this system is well supported and has good documentation. Many open source projects are well supported and well documented too, but many aren't.
- I have used this for standard CMS implementations, as I like the flexibility it affords me.
- Joomla!: PHP, open source Content Management System and MVC Framework
- I can implement Joomla! as a basic CMS system, or deploy it in a way that produces a full-fledged, richly-featured site. I understand the way it operates, and can use its or the community's extensions, as well as create new ones from scratch.
- Here are a couple of examples of how I've used this system beyond basic CMS implementations:
- I've combined the built-in menu and banner modules to provide enhanced navigation options for the client.
- For a custom scheduling workflow, I gave front-end Joomla! users the ability to publish their availability over a short window (two or three days) to other front-end users. Once logged in, each user can select available times from a personal page showing a simplified personal agenda, and can see a list of available users via a module published in the portal.
- WordPress: PHP CMS or Publishing Platform
- I am able to help you use this system for a simple blogging website, a simple business website, or a full-fledged multi-blog business website. Wired, Yahoo!, the US Air Force, TechCrunch, Ben & Jerry's, the Wall Street Journal Magazine, and other respected, established companies use this system to drive their websites and/or blogs.
- I can implement any theme, plugin, or customization that's needed. Of course, it'll be easier if we select components that work well together from the start, but I can write new functionality to replicate or extend any features you wish.
- In terms of process, what I typically do for WordPress sites is start with the overall design, then decide on the content, and finally develop the theme and stylesheets to match the design, structured around the organization of the content. I also ensure that the content is easily maintainable and expandable. Because I understand what WordPress is doing 'behind the scenes' (PHP and SQL expertise and deep system understanding), I can quickly grasp WordPress's limitations and look for alternative ways to accomplish a goal, rather than straining to find a plugin that does the work in a non-integrated fashion.
- I think the primary difference between people new to WordPress and genuine gurus is adherence to best practices: using consistent styling, using the proper WordPress calls to determine context, and keeping the site as simple to maintain and update as possible. It is very easy with WordPress to hack in changes as needed, but as those hacks accumulate, they can make quite a mess. Similarly, a clean separation of layout from appearance in the stylesheets is important when creating a theme. For a reusable theme, the two should be entirely separate, so the styles can be updated to reflect a new color scheme or design without preventing maintenance updates from being applied to the theme and layout.
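As an illustration of "proper WordPress calls to determine context," here is a minimal theme fragment. It is a sketch only, not code from any client project; the template-part names are hypothetical, but the context functions are standard WordPress API:

```php
<?php
// Illustrative WordPress theme fragment: use WordPress's own context
// calls rather than hard-coding URLs or hacking per-page conditionals.
get_header();

if (is_front_page()) {
    get_template_part('content', 'home');   // loads content-home.php
} elseif (is_single()) {
    get_template_part('content', 'single'); // loads content-single.php
} else {
    get_template_part('content');           // generic fallback: content.php
}

get_sidebar();
get_footer();
```

Structuring templates this way keeps per-context markup in separate, reusable files, so the theme stays maintainable as the site grows.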
- I understand how to make WordPress sites SEO friendly.
- I am familiar with WordPress plugins, which allows me to help customers more efficiently by applying already-written plugins, versus creating them from scratch. I also know which plugins to avoid.
- I think, just like when choosing a project framework, WordPress is a great tool for specific jobs. At the same time, it has been used for many things that it is not good at, which frequently has resulted in migrations to other platforms or full custom site rewrites. I have learned a lot already about what WordPress is good at and what it isn't, and I think that understanding and unbiased perspective is important when planning a project. WordPress is a great blogging platform.
- MVC Experience: I would like to demonstrate my knowledge of this advanced pattern by describing how it works and how I've used it:
- This is a pattern for developing applications that makes use of reusable user interfaces and object models. The controller glues the view and the model together by defining which parts of the model appear in the view. The view takes input, which goes through the controller for translation to the model.
- The model can then update the view through an observer pattern.
- I've used a variant of this (Model 2) in the UAR ABLE framework. The controller defines the combination of functionality that takes place for a page, and feeds the view the necessary models that are needed to make it display. I've also started developing more of a standard MVC pattern in the UAR ABLE framework, using JavaScript and the Prototype library to provide real-time updates on the client-side page.
- Besides designing my own framework, I've used CakePHP, CodeIgniter, Symfony, Zend, and other PHP frameworks that use this pattern. (My experience with each of these is presented in more detail in this online resume.) I encourage you to read how I respond when people ask whether I have experience in tools I have not yet applied for customers.
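The MVC flow described above can be sketched in a few lines of plain PHP. This is an illustrative toy, not code from the UAR ABLE framework or any client project; all class names are hypothetical:

```php
<?php
// Minimal MVC sketch: controller translates input into model changes,
// then asks the view to render the model.

class ArticleModel {                 // Model: owns the data
    private $title = 'Hello';
    public function getTitle() { return $this->title; }
    public function setTitle($title) { $this->title = $title; }
}

class ArticleView {                  // View: renders the model
    public function render(ArticleModel $model) {
        return '<h1>' . htmlspecialchars($model->getTitle()) . '</h1>';
    }
}

class ArticleController {            // Controller: glues input, model, view
    private $model;
    private $view;
    public function __construct(ArticleModel $model, ArticleView $view) {
        $this->model = $model;
        $this->view = $view;
    }
    public function update(array $input) {
        if (isset($input['title'])) {          // translate input to the model
            $this->model->setTitle($input['title']);
        }
        return $this->view->render($this->model);
    }
}

$controller = new ArticleController(new ArticleModel(), new ArticleView());
echo $controller->update(array('title' => 'MVC in a nutshell'));
```

In a full implementation the model would notify the view of changes through an observer, as noted above; here the controller simply re-renders after each update.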
- ORM Experience: I would like to demonstrate my knowledge of this advanced pattern by describing how it works and how I've used it:
- This is a technique that translates between relational data structures and object data structures. Generally, you see it used when an object model defines your domain knowledge and the persistent data is stored in a relational database. The two don't always map cleanly, so this layer provides the translation.
- There are a number of packages that can do a lot of this automatically for you through some basic configuration, but the most complete solution is usually achieved by writing a custom ORM system for the specific domain problem faced.
- I've written a custom ORM between the UAR ABLE framework's model system and the datasources that track and query persistent data.
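The core of the ORM idea can be shown with a tiny data-mapper sketch. This is illustrative only, not the UAR ABLE implementation; the `users` table and class names are hypothetical, and the database calls use PHP's standard PDO API:

```php
<?php
// Toy data-mapper ORM: translates rows to objects and back.

class User {               // domain object
    public $id;
    public $email;
}

class UserMapper {
    private $pdo;
    public function __construct(PDO $pdo) { $this->pdo = $pdo; }

    // Relational row -> domain object.
    public function find($id) {
        $stmt = $this->pdo->prepare('SELECT id, email FROM users WHERE id = ?');
        $stmt->execute(array($id));
        $row = $stmt->fetch(PDO::FETCH_ASSOC);
        if (!$row) { return null; }
        $user = new User();
        $user->id = (int) $row['id'];
        $user->email = $row['email'];
        return $user;
    }

    // Domain object -> relational storage.
    public function save(User $user) {
        $stmt = $this->pdo->prepare('UPDATE users SET email = ? WHERE id = ?');
        $stmt->execute(array($user->email, $user->id));
    }
}
```

Packages like Propel or Doctrine generate much of this automatically from configuration; a hand-written mapper like the above earns its keep when the object model and the schema diverge in domain-specific ways.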
- CakePHP: open source PHP Framework
- I used a built-in active record system to enable order management and fulfillment processes.
- I integrated MooTools ("a super lightweight web2.0 JavaScript framework") into the framework.
- I developed a small, rich text controls helper, providing autocomplete and date picker functionality.
- CodeIgniter: open source PHP Framework
- I added CMS functionality and a drag-and-drop layout system within an application that uses CodeIgniter.
- I wrote a basic ActiveRecord base object that supports the pattern better than the built-in active record system.
- I developed a custom shopping cart using AJAX.
- I created authentication, SOAP transaction, newsletter, mailer, and message tracking system modules.
- I extended the core application model to integrate an application with a FileMaker database system.
- I created a system by which a single instance of a core set of libraries, models, helpers, etc. is shared between multiple websites.
- I created a reusable and configurable application to handle sealed bid auctions.
- For a major international brewing company, I used CodeIgniter to handle most of the page logic and presentation organization. Specifically, I used the FX.php library to connect CodeIgniter to the backend database, where most of the site's data is stored. I used a dynamic FTP system to create temporary accounts on the fly for a more robust upload system that can handle files multiple gigabytes in size. A legacy database (a hard project requirement) drove the system, which made the project more challenging.
- For the largest wholesale baker of cakes in the US, I used CodeIgniter to handle their specific business rule and presentation needs.
- Kohana: open source PHP Framework
- This framework, originally based on CodeIgniter, is a very clean PHP MVC Framework.
- I like this framework, and hope it does well. Given that it's a younger framework, I usually use it when it's requested. Community size, support, and longevity are important considerations in choosing a PHP framework.
- Symfony: open source PHP Framework
- I know the Symfony MVC structure.
- I have experience with the third-party abstraction layers included in the Symfony framework, such as the "Criteria" and "Propel" abstraction layers. Criteria builds an SQL query from parameters, while Propel accepts raw SQL queries for more complex or specific needs.
- I've used Prototype, jQuery, and script.aculo.us, the JavaScript frameworks/libraries included with Symfony, for uses that range from simple to advanced.
- I've integrated PHP and C++ document conversion libraries with Symfony to allow multiple types of documents to be combined into a single PDF download.
- Zend: open source PHP Framework
Certification Link on Zend’s Site
| Certificate Confirmation Letter | Certificate
- A major project I was involved with used Zend to connect one customer with all of its customers so it could provide value-added services to them automatically. The system was developed so it could be applied to other sectors the primary customer serves with no retooling.
- I used Zend to create a delivery management system, including the front-end customer interface and a fully-functional back-end system for managing the business logic.
- I used components, including Zend_Forms; the Zend database tables ORM layer; and custom-written functional classes, for the model interaction.
- I tied in jQuery to provide rich, AJAX-driven features on the front-end side.
- Using a combination of jQuery and Zend, I created a robust geomatic services module to validate and geocode addresses in a restaurant delivery management system.
- I programmed a full-featured shopping cart system with support for multiple tax zones, special tax exemptions, dynamically-determined fees, digital and physical goods, and the ability to have multiple people participate in an order. All of these features integrated with Zend's database, controller, and view layers.
- I programmed a complex, dynamic fee system that allowed an administrator to specify rules that determined whether a particular fee, which could be either a flat fee or a percentage of the total order, would apply to a particular order. The rules could be based on the delivery location, the total order cost, the total number of items in the order, the item vendor, the location of the item vendor, the type of order, and the estimated time of delivery of the order. The system made use of Zend's automatic class loader to allow rapid development of the rules using a modular system.
- The shopping cart system allowed multiple users to add items to a single order, and allowed multiple users to pay for a single order. The master user was able to control how much the other users could spend, and was able to grant stipends to those users.
- I programmed an interface to validate a user's address by confirming the details of their address using MelissaData address verification and validation services.
- I used Google Maps mapping functionality to "show" the address the user was plotting, and display related restaurants close to their location.
- I programmed "radius" functionality that allows a user to define a certain search radius using their originating address as the center point. The Haversine formula was used to calculate a "zone" over a spherical area (the Earth) that corresponds to the radius designated by the user.
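The radius check above rests on the Haversine formula. Here is a minimal PHP sketch of it; the function and variable names are my own for illustration, not from the production system:

```php
<?php
// Haversine distance: great-circle distance in miles between two
// latitude/longitude points on a sphere (the Earth).
function haversineMiles($lat1, $lon1, $lat2, $lon2) {
    $earthRadius = 3958.8; // mean Earth radius in miles
    $dLat = deg2rad($lat2 - $lat1);
    $dLon = deg2rad($lon2 - $lon1);
    $a = pow(sin($dLat / 2), 2)
       + cos(deg2rad($lat1)) * cos(deg2rad($lat2)) * pow(sin($dLon / 2), 2);
    return $earthRadius * 2 * asin(sqrt($a));
}

// Usage sketch: keep only restaurants within the user's chosen radius.
// $inZone = haversineMiles($userLat, $userLon, $restLat, $restLon) <= $radiusMiles;
```

For small radii this spherical model is accurate to well under one percent, which is more than sufficient for delivery-zone searches.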
- E-commerce: related systems, carts, and more:
- I've worked with open source, hosted, and custom-built shopping carts.
- I've worked with payment and accounting interfaces.
- I've used Authorize.net, VeriSign's Payflow Pro API, Intuit's QuickBooks Interface, Paypal's APIs, and more in support of e-commerce financial and ordering transactions.
- I've used UPS WorldShip and USPS APIs for enhanced shipment processing.
- In addition to using CRE Loaded, Magento, and osCommerce (all described below in detail within this resume), I have used X-cart, Zen Cart, and many other commercial and open source solutions. Though it's written elsewhere in this resume, I think it useful to mention that I know systems, I know programming, and I've helped many companies implement and customize their e-commerce implementations. I can help you with any e-commerce implementation.
- E-commerce search and information access:
- I know Lucene/Solr, and can implement this open source, advanced search functionality to provide Google-like search results, with relevance percentages reported back to the user. This can be very useful for e-commerce implementations or for websites where there is a lot of data.
- I can use Endeca, a commercial, enterprise search tool to provide advanced search and information presentation capabilities. Example features I can implement: products can be presented based on availability; landing pages, category lists, navigation, labels, and more can be created dynamically; and intelligent searching enabled based on a dynamic dictionary generated and maintained by user-entered and administrator-entered data (allows for many presentations of data based on what the user meant to type; very helpful for commonly-misspelled products).
- I view e-commerce workflows as a subset of business workflows. In my opinion, e-commerce has two components:
- End-user presentation with calls to actions, generally to purchase or encourage more research internal to the site, and procedures to encourage more spending and loyalty.
- Back-office operations for allowing more to get done with fewer resources, and, in fact, encouraging and allowing end users to perform as much of these as possible.
- CRE Loaded: open source, PHP e-commerce system
- I can implement CRE Loaded, extend it, theme it, and help you use it for any purpose you wish.
- I have done detailed and extensive theming of CRE Loaded for the presentation layer of the system.
- I've created custom components to pull featured information from CRE Loaded for insertion into other parts of the website.
- Magento: open source e-commerce system
Certification Link on Magento’s Site
- Overview:
- I’ve been using Magento since it was created, before the company was named Magento and before it was bought by eBay, as PHP has been my near-exclusive focus since 2001. I used one of Magento’s “first drafts,” osCommerce, which Varien worked with before creating Magento. I have worked directly with ex-Varien team members on customer Magento projects. Magento is built on Zend, in which I'm certified.
- I’ve served JYSK, Stoelting, Spiewak, SoundsTrue, MonoMachines and Ochimp, AccuQuilt (this covers two other sites too: AccuCut Craft and Top Dog Dies), and small shops with their Magento needs. Disclaimers: for all of these, I didn't do their front-end designs or advise on e-commerce strategies. If front-end integration and development was done, it was with the graphics assets they provided and per the interaction approach that they wanted.
- I’m not aware of anything that I haven’t done with Magento on the back-end or the front-end. Whether it’s writing new modules, extending existing ones, using well-documented APIs, reverse-engineering and employing obfuscated APIs, implementing a stock theme, or making the front-end dance with the JavaScript/CSS framework or toolset of the day, I’ve done it.
- In terms of back-office integration, most of my work has been with NetSuite, though we’ve used other ERP systems too. I worked directly with an ex-developer from NetSuite, and have learned a lot from him. On occasion, since he still has peers who work there, he’s helped me troubleshoot some of the more difficult situations presented by this platform (a lot of options are limited since it’s a closed, SaaS system). Though I’m not an expert in NetSuite itself, since I consider that a domain one level removed from my focus, I am expert in some aspects related to its data and efficiency, and I have second-degree peers who can help me quickly if I need it.
- I am a full-stack developer, meaning I can work on the server-side (back-end) and the client-side (front-end) of the application build out. (Moreover, I can also architect software and set up the systems, local and cloud-based, in support of it.) This means that I am not limited to any e-commerce framework. If what you’d like to do is not provided by default, configurable, or present in a respected third-party module, I can write it. No matter the transaction level you need to achieve, I can write the code and incorporate systems to support that level. Unless specifically asked not to (for example, for pilot systems, MVPs, demonstrations), all code that I write is created to be high-performance and scalable.
- I am hired to consult on the best approaches to implement an e-commerce system from a technical standpoint, implement new systems, migrate old systems, migrate from a hosted solution to a custom-built solution, migrate systems from internal servers to a cloud platform, move a system along its product roadmap, and fix issues. I know people who have decades of consulting experience in the relatively young field of e-commerce, including people with deep user experience skills (some have literally written the O’Reilly books that are used to teach the material in universities) and people who have deep e-commerce experience (one having sold one of his businesses to Oracle). I am often hired by graphic designers, usability professionals, and e-commerce consultants to handle all of the technical components of what they do. Together, we do great things, and I think we work really well together as we respect each other’s sciences and arts.
- Important points one should be aware of when developing in Magento (and much is applicable elsewhere when abstracted):
- Improve category loading time by modifying the recursion level, which is often necessary when handling a lot of categories in the admin.
- When creating a block that needs to display different content for different layout handles, you can create a layout action that sets a parameter, which the block then evaluates.
- Never make an override by copying the same class on a different scope.
- Never use the table names directly while using installers / upgrades. Instead, use the resource model configuration defined in the config.xml file to get the proper table name (including prefix).
- Avoid overriding controller actions to extend functionality. Use observers instead, adding your custom functionality when the relevant event is dispatched.
- Never modify the Magento core. Instead, do proper rewrites in the local code pool to extend or modify functionality as needed.
- Avoid modifying a Magento default package/theme. Instead, create your custom package/theme, and pull in a copy of the PHTML template files before implementing your custom modifications.
- When creating a Magento admin Grid, extend the grid block class from Mage_Adminhtml_Block_Widget_Grid, add collection to use in _prepareCollection method, and add columns in _prepareColumns method.
- Never use createBlock method inside a template. Instead, add the block in layout and use getChildHtml to call it.
- Never use $_POST / $_GET to handle parameters. Use the Request Params functionality instead.
- Use Magento's debug mode to log data for internal use, so the logging stops once a site goes live while your debug functionality is retained as needed.
- Never use visibility or stock attributes as filters directly on a Catalog Product collection. Use instead the Catalog Product Layer methods to handle them properly for Front-end pages.
- Avoid disabling modules to get overrides working. Use depends directive on the modules definition properly.
- Never remove catalogsearch.leftnav block from the catalog search if you want to keep the search working properly.
- Avoid using the Zend ORM methods, such as where() or join(), to modify collections. Use the defined Magento methods instead, such as addAttributeToFilter or addFieldToFilter.
- When creating additional attributes for modular functionality, add them in upgrade scripts instead of creating them manually in the admin.
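Two of the rules above can be shown concretely. This is a hedged sketch in Magento 1.x terms; the module alias "mymodule" and the parameter name are hypothetical, but `getRequest()->getParam()` and the setup script's `getTable()` are standard Magento 1 API:

```php
<?php
// Rule: never read $_POST / $_GET directly -- use the request object,
// which normalizes GET/POST params and plays well with the router.
$sku = Mage::app()->getRequest()->getParam('sku');

// Rule: never hard-code table names in install/upgrade scripts -- resolve
// them through the resource configuration in config.xml, so any configured
// table prefix is applied automatically.
$installer = $this; // inside a setup script extending Mage_Core_Model_Resource_Setup
$table = $installer->getTable('mymodule/entity'); // e.g. "prefix_mymodule_entity"
```

Both habits keep custom modules portable: they survive table-prefix changes and avoid bypassing the layers Magento expects to own.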
- Case Studies – examples of Magento work in more detail:
- For the last two years, I’ve led up all the development for a site that sells new and used technology to people who would like to buy to own the goods over time. The work includes the initial implementation, feature enhancements, and a custom-built API to process and validate data at critical steps. As buyers generally have no established credit history, no credit, or bad credit, these validation steps are of critical importance, as is the tracking of payments and processes in support of motivating payments and collections. The rent process is custom, as well as how the inventory is processed, meaning it was quite heavy on the custom business processes, resulting in custom modules being created in support of that. Examples:
- Inventory modifications: built a new system for tracking inventory with two conditional fields: serial number and inventory control. Each new inventory item has its own properties (for example: color, capacity, condition, and price). With this new inventory management system, there can be one product, such as "iPhone 4," as well as many different products based on the possible combinations of properties, each with its own serial number and inventory control. It works similarly to the Configurable Product.
- Inventory status: added a status (Available, Reserved, Sold) to recognize and avoid two customers renting or buying the same product. When a customer selects all the options and clicks the button to rent or buy, the product’s state is changed to Reserved.
- Product View: modified the Product View to show all available combinations, since the inventory system had been customized. Created a series of JavaScript classes to process all combinations of a Configurable Product. All combinations were visible and could be sorted in different ways, such as by price.
- Rent Process: created a workflow in which the customer must complete a series of forms to input personal, financial, shipping, and damage insurance data. The customer is then presented with a contract that reflects all of this data, after which the standard Magento checkout process can be completed. Each step is connected with API Calls to the API provider.
- For an auto parts store, besides the standard e-commerce workflows, a lot of work was done to achieve the search and drill-down features that they wanted. The default Layer Navigation of Magento would not enable what they wanted to do. Example customizations:
- Unique Combinations: the Unique Inventory Control module was used without the inventory control, serial number, price, and status conditions. Instead, only the attributes that compose the product's properties were used, the most important for them being "Model" and "Year".
- Filter Search: the filter search is basically a list of the currently available product values for each attribute combination. It takes stock into account when generating results. For example, someone who wants a list of all parts for 2010 and 2011 Bajas can get it by simply selecting those parameters.
- One customer wanted the ability to generate sites for affiliates, and the module to drive this one was challenging to create. Example features:
- Site Creation: the affiliates could customize the site logo as well as the look and feel, activate and deactivate products and categories, and create banners. Each had a unique site URL that is added to the banner and used to identify this affiliate so each visitor from that affiliate would be presented with what they had customized and could be tracked easily.
- Parent/child Affiliate Relationship Management: if a visitor of an affiliate wanted to become an affiliate, this was also tracked, and the new affiliate would be a “child” of the “parent” affiliate.
- Commission Management: there was a quite flexible commission hierarchy. An administrator could set default commissions, or set them by category, product, or affiliate, as well as configure expiration periods. From this main configuration, the affiliate could share the commission with the "children" by the same parameters. A per-visitor time limit was also configurable, so that if a visitor kept buying products from an affiliate, commissions would expire after that limit.
- Reporting: this module contained many monitoring reports as well as export tools.
- Other:
- Created a custom EAV model with a set of related collections, whose attributes save different values per store. It was great to learn deeply about the database schema for this project.
- Worked on a project with many sites, many store groups by site, and many stores (more than 500) where the products and other entities could be custom for each store. This was a good experience for learning about the indexation process for optimization.
- Created an alternative-buy process, which required a deep understanding of the buy process flow, including how the associated events (quote, stock, and orders) work.
- Created a theme for Magento using SASS to make it behave responsively.
- Implemented a model extension (new abstract class and concrete classes, along with some updates in an observer) for an existing third-party API to handle new calls from the same service and from a new one.
- Created a means to sell any product through a video player.
- For one project, all system areas on the front and backend were modified to include API calls to an Enterprise Service Bus Web Server. Every action regarding a product, customer, category, and checkout was modified to include an API call.
- Implemented a Single sign-on (SSO) feature, configurable in the admin.
- Modules:
- Mainstream: I’ve used every mainstream module that I’m aware of. Here’s a sampling, not all-inclusive, of ones I’ve used:
- Abandoned Carts Alerts Pro: http://www.magentocommerce.com/magento-connect/abandoned-carts-alerts-pro-1.html
- aheadWorks AJAX Cart Pro: http://ecommerce.aheadworks.com/magento-extensions/ajax-cart-pro.html
- aheadWorks Follow Up Email: http://www.magentocommerce.com/magento-connect/follow-up-email-by-aheadworks.html
- aheadWorks Product Color Swatches: http://ecommerce.aheadworks.com/magento-extensions/product-color-swatches.html
- Amasty Custom Stock Status: http://amasty.com/custom-stock-status.html
- Authorize.net CIM-Certified by Authorize.Net-Payment Module: http://www.magentocommerce.com/magento-connect/authorize-net-cim-certified-by-authorize-net-payment-module.html
- Blank Theme: http://www.magentocommerce.com/magento-connect/blank-theme.html
- Dataflow Batch Import + Export Orders To CSV / XML: http://www.magentocommerce.com/magento-connect/dataflow-batch-import-export-orders-to-csv-xml.html
- Easy Lightbox 2.0: http://www.magentocommerce.com/magento-connect/easy-lightbox-2-0-free-magento-extension.html
- Ebizmarts Sage Pay Suite Pro: http://www.magentocommerce.com/magento-connect/ebizmarts-sage-pay-suite-ce-europe-free-sagepay-official-extension.html
- Enhanced Admin Product Grid: http://www.magentocommerce.com/magento-connect/enhanced-admin-product-grid.html
- Fooman Google Analytics +: http://www.magentocommerce.com/magento-connect/fooman-google-analytics.html
- Fooman Speedster: http://www.magentocommerce.com/magento-connect/fooman-speedster.html
- Layered Navigation Pro: http://www.magentocommerce.com/magento-connect/layered-navigation-pro.html
- Magento WordPress Integration: http://www.magentocommerce.com/magento-connect/magento-wordpress-integration.html
- Magestore Banner: http://www.magestore.com/magento-banner-slider-extension.html
- Magestore Gift Card for Magento: http://www.magentocommerce.com/magento-connect/gift-card-for-magento.html
- Magestore Mega Menu: http://www.magestore.com/magento-mega-menu-extension.html
- Magic Zoom: http://www.magentocommerce.com/magento-connect/magic-zoomtm.html
- One Step Checkout: http://www.magentocommerce.com/magento-connect/one-step-checkout-v4.html
- ProNav (Mega Dropdown): http://www.magentocommerce.com/magento-connect/pronav-mega-dropdown.html
- Simple Configurable Products: http://www.magentocommerce.com/magento-connect/simple-configurable-products.html
- Sweet Tooth Loyalty & Reward Points: https://www.sweettoothrewards.com
- Unirgy Dropship: http://www.magentocommerce.com/magento-connect/unirgy-dropship.html
- Unirgy StoreLocator: https://secure.unirgy.com/products/ustorelocator
- Xtento Order Export: http://www.xtento.com/magento-extensions/magento-order-export-module.html
- Custom: Here is a sampling of modules that I’ve worked on over the years. This includes writing modules from scratch, as well as extending existing ones:
- AccessoriesNewsletter: automatically sends newsletters to past customers with accessories related to the previously-purchased products.
- AddFreeProducttoCart: this module extends the native Magento functionality to automatically add a product to a cart, meaning an item(s) will appear in the shopping cart based on a set of rules-based criteria.
- AgeGate: shows a popup requesting the customer’s age after the first product is added to the cart.
- AJAXCartPro: this module makes the Magento UI more customer-friendly as preferred by that client by means of AJAX.
- AttributesInstaller: installs all custom attributes for the site.
- AutoSuggest: integrates search suggestions with Solr.
- Banners: administers banners in different sections of the site by banner category.
- Blog: provides blogging features, such as the ability to turn the blog on or off for different stores, a latest-posts widget, a native Magento WYSIWYG editor, tagging, and a comments-per-page option.
- BoughtProducts: shows a list of products often bought with the current product being viewed or purchased.
- Brands: presents a brand list with links to search within each brand store or present product listings by brand along with brand images.
- CartReservation: enables reserving products before buying them.
- CasePrices: handles different prices for different cases of products (volume ordering).
- CategoryGroups: adds functionality to make category groups and renders those on the frontend.
- CategoryTree: customizes the category tree default functionality for Magento to handle many categories at a speed the customer wants.
- ColorSwatchesPro: presents configurable products differently by animating text attributes with images.
- ContentInstaller: programmatically creates all stores, websites, attribute sets, and CMS pages.
- CustomAttributes: adds custom attributes to the site using installers.
- CustomReports: presents different reports in the admin. This was requested by the sales department.
- DelayedDelivery: initiates delivery times based on administrative configurations.
- Discounts: adds all products with discounts in one category and removes them when the discount expires.
- EnvironmentalFriendly: presents environmentally-friendly products tailored to the customer’s home state.
- FinalDashboard: offers order reports with custom filters.
- FreeShipping: adds functionality to show a badge when a product has free shipping.
- GrossReport: admin functionality that generates a gross report per order in a table with custom filters.
- HeroGraphic: handles sliders for frontend presentation, configurable in the Magento admin.
- JobOpportunities: adds functionality to CMS pages (job opportunities pages) to upload resumes.
- LowStockAlert: sends an email listing products approaching a low-stock state. The trigger can be configured by several parameters per product.
- MageMonkey: integration of MailChimp with Magento.
- ManaPro: improves Magento’s Layered Navigation.
- Monthly payments: presents the number of monthly installments required for a product to be paid off.
- MultipleSites: displays different themes for pages based on custom rules.
- OneStepCheckout: simplifies the checkout process of the Magento store.
- OrderCost: imports order costing for all orders.
- ProductColorSwatches: replaces product options with swatches and shows the appropriate product image based on the selected attributes.
- ProductList: creates a widget type “Product List” using various filters, and it can be rendered anywhere.
- ProductRegistration: adds the possibility for registering products. The customer can log in and register products or can create an account and register products. The customer can view the registered products from their account, and the registered products are displayed in the Magento admin area. The admin can upload a file with valid serial numbers for products that can be registered.
- ProductUpdate: permits the updating of products from files using cron.
- Rental Approval: enables a custom rental process by product including quotes, order management, and inventory data. It uses an external API to request the validation needed to rent the product.
- Rewards: several modules that extend Magento functionality to reward customers with loyalty points based on defined actions such as purchases, referrals, sharing products, sharing purchases, product reviews and ratings, account signups, and newsletter subscriptions. Everything can be managed from the admin.
- SOAPOrderUpload: connects Magento with an external ERP.
- SocialMedia: generates a social feed for different social networks in one place ordered by time.
- SoldBy: displays who is selling the product based on a product attribute.
- SportsWearSiteCartEnhancements: handles a cart’s expiration time and overrides the AJAX service in the checkout process to dynamically present the payment and shipping methods according to the selected country.
- Spotlight: adds a feature like Pinterest to Magento to present a new section with projects (new entities not using EAV) created by users and displayed on the frontend. For each project, a user can add several images related to his/her project, a description, and products associated with it. It also enabled album creation and product association.
- SpotlightTagging: allows admin users to quickly add custom tagging to products using a group of grids and forms. The tags are updated on the product pages via this process. This included a report to review tagging history by user.
- Subcategories: displays the first subcategories level of the current category depending on the category level.
- Translator: used to translate content such as CMS blocks and specific product data like the description. This translation service used an API to send the content and receive the translation, with each translation resulting in a fee. The admin could monitor the translation activities as well as run reports related to it.
- UniqueInventory: handles the product stock independently, adding more attributes for each product (like color, condition, and capacity) and creating a new unique element to represent this set of properties. It would also book the product when the user chose it and free it after the expiration period.
- UpdatedPrices: sends email alerts when product prices are modified and creates a log of all changes in a custom table.
- Warranty: adds different types of warranties to products and product categories to handle the various warranties a store might offer.
- Mainstream: I’ve used every mainstream module that I’m aware of. Here’s a sampling, not all-inclusive, of ones I’ve used:
- Third-party systems, toolsets, and frameworks I’ve used with Magento:
- Commerce Bug: a tool from Pulse Storm that presents really helpful information about each page/request, such as the names, handles, paths, and models. Link: http://store.pulsestorm.net/products/commerce-bug-2
- Magento Profiler: “[This] is a drop-in replacement for the Varien_Profiler. It can be activated in the configuration and doesn’t require you to change any core files. It captures all data and also records its hierarchal information and displays everything in a nice way.” Source: https://github.com/fbrnc/Aoe_Profiler
- Magento TAF: “The Magento Test Automation Framework (MTAF) is a system of software tools used for running repeatable functional tests against the Magento application being tested. MTAF is used for both writing test automation scripts and for performing the actual testing. Test automation scripts created within the framework can be used for testing most Magento functionality which does not relate to an external system. This is a cross-platform solution (not dependent on a specific operating system). MTAF allows QA specialists to quickly develop all kinds of tests for the current Magento version, and the tests can be reused at any time. Framework users can run a single test independently, a bunch of tests together (a test suite), or all available tests.” Source: MTAF Installation Guide (pdf)
- Magicento in PHPStorm: “Magicento is a PHPStorm plugin for Magento developers. Features include: goto for factories and template paths, autocomplete for factories, xml files and class names, documentation for xml nodes, evaluation of PHP code inside Magento environment, and much more!” Source: http://magicento.com
- Magmi: This is a useful tool for importing data, permitting an import using direct SQL. Other features: “Automatable through CLI or curl/wget, flexible CSV format support [Dataflow based, subset of it & extensions], computed values through Value Replacer plugins, [handling] multiple stores configurations, images import (remote & local) through Image Processor Plugin, can import ‘customizable options’ through custom options plugin, can import tier prices, can create categories on the fly based on name / tree description, will automatically create select/multiselect option values based on imported data, [and] can be integrated in custom PHP scripts through ‘Datapump API’ feature.” Source: http://sourceforge.net/projects/magmi
- N98MageRun: The n98 magerun Command Line Interface (CLI) tools provide some useful ways to work with Magento, such as: “a database dump feature, cache clearing, [and] admin user password reset. You can also easily install a complete shop and sample data with [the built-in] installer. There are also many features like a module kickstarter (with modman support).” Source: https://github.com/netz98/n98-magerun
and http://magerun.net
- Xdebug: A helpful debugging and profiling tool. Link: http://xdebug.org
- Other:
- All mainstream JavaScript frameworks like jQuery, AngularJS, Node.js, etc.
- All mainstream CSS preprocessors and frameworks, like Sass, Canvas, etc.
- All mainstream PHP frameworks, including Zend, Laravel, Kohana, CodeIgniter, Fuel PHP, etc.
- All mainstream PHP CMS or CMS frameworks, including WordPress, Drupal, TYPO3, Expression Engine, etc.
- APIs: almost every Magento site I was involved with uses APIs. Here are some examples:
- Magento API: I have used all available methods of the Magento API.
- One such API created the XML files with the information related to the rental process, made the requests, and handled the response to validate the data. Here is a brief description:
- Send the user information and the data related to the payment -> Receive the information about whether the user is allowed to rent
- Send a request to create a transaction with the user data received from the API -> Receive the transaction information
- Send the rental information (product, shipping method, warranty) -> Receive the approval to make the operation and a document with terms and conditions
- Send the confirmation after the order is created -> Receive a final operation ID
- One Hour Translation: This API was used to translate words in different languages, sending JSON requests through a group of callbacks. Here is a brief description:
- Send the authentication request -> Receive a confirmation
- Send content and an estimate request -> Receive a price for the translation
- Send the confirmation for translation -> Receive the translated content
- USPS and FedEx: address verification services.
- Payment Services: PayPal, Amazon Payments, Authorize.net, and many other merchant accounts.
- MACH Software: order management activities.
- Health check software: monitor live sites.
- Extended the model layer of an existing third-party API (a new abstract class and concrete classes, along with some updates in an observer) to handle new calls from the same service and from a new one.
- For one project, all system areas on the front and backend were modified to include API calls to an Enterprise Service Bus Web Server. Every action regarding a product, customer, category, or checkout was modified to include an API call.
- Search: I’ve used existing search features, extended existing search features, written new search features, and have implemented other ways of searching the data, such as using Solr.
- osCommerce: open source, PHP e-commerce system
- I can implement osCommerce, extend it, theme it, and help you use it for any purpose you wish.
- I have done detailed and extensive theming of osCommerce for the presentation layer of the system.
- I created a shipping module to provide international USPS support.
- Social Networking: please see the work I've outlined in my Interfaces section of my resume.
- Google Maps: here is some information about my Google Maps knowledge and application experience in general:
- Programmed an interface to validate a user's address by confirming its details using MelissaData address verification and validation services.
- Used Google Maps mapping functionality to "show" the user the address they were plotting, and to display related restaurants close to their location.
- Programmed "radius" functionality that would allow a user to define a certain search radius using their originating address as the center point. The Haversine formula was used to calculate a "zone" over a spherical area (the Earth) that corresponds to the radius designated by the user.
- Designed and implemented Google Maps into a vacation website.
- Created user interface to add properties, such as name, address, and zip code.
- Integrated Google Maps geocode function to convert address/zip code into longitude and latitude coordinates.
- Displayed locations on Google Maps using custom icons to convey the type of property.
- List of properties displayed outside of map was grouped with a map legend of property types.
- Created hyperlinks (with custom icon) outside of the map area that, once clicked, displayed property images and location details on the map, along with a hyperlink to the target website.
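The radius functionality described above rests on the Haversine formula. Here is a minimal PHP sketch of the distance test; the function names and the mean earth-radius constant are my own illustration, not the original system's code:

```php
<?php
// Haversine "radius" search helper: great-circle distance between two
// points on the Earth, used to test whether a record falls inside a
// user-defined search radius. EARTH_RADIUS_KM is a mean-radius approximation.
const EARTH_RADIUS_KM = 6371.0;

function haversineKm(float $lat1, float $lon1, float $lat2, float $lon2): float
{
    $dLat = deg2rad($lat2 - $lat1);
    $dLon = deg2rad($lon2 - $lon1);
    $a = sin($dLat / 2) ** 2
       + cos(deg2rad($lat1)) * cos(deg2rad($lat2)) * sin($dLon / 2) ** 2;
    return EARTH_RADIUS_KM * 2 * asin(sqrt($a));
}

// Is a candidate point within $radiusKm of the originating address?
function withinRadius(float $olat, float $olon, float $clat, float $clon, float $radiusKm): bool
{
    return haversineKm($olat, $olon, $clat, $clon) <= $radiusKm;
}
```

In practice the same formula can be pushed into the SQL WHERE clause so the database filters candidate rows before PHP sees them.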
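The geocoding step above (address to longitude/latitude) can be sketched as follows. This targets the current Google Geocoding API JSON endpoint; the API key and the helper names are placeholders of my own, and the real site would make the HTTP request and handle errors around these functions:

```php
<?php
// Sketch of the address -> coordinates step for the property map.
// Builds a Google Geocoding API request URL and pulls lat/lng out of the
// decoded JSON response.
function geocodeUrl(string $address, string $apiKey): string
{
    return 'https://maps.googleapis.com/maps/api/geocode/json?'
         . http_build_query(['address' => $address, 'key' => $apiKey]);
}

// $response is the json_decode()d API payload (associative array).
function extractLatLng(array $response): ?array
{
    if (($response['status'] ?? '') !== 'OK') {
        return null; // e.g. ZERO_RESULTS, OVER_QUERY_LIMIT
    }
    $loc = $response['results'][0]['geometry']['location'];
    return ['lat' => $loc['lat'], 'lng' => $loc['lng']];
}
```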
- Template Engines
- I prefer to program templates in pure PHP instead of using another "programming language" to program the template logic. However, I understand there is great value to be gained by using a template engine for some projects.
- I have used Smarty to develop a highly-dynamic website that places an emphasis on the proper separation of business logic and presentation logic. This separation made modifications to the site far easier for the users involved, and reduced the amount of time required to implement feature additions and changes for the users involved. It permitted the site to rapidly evolve as its visitors requested additional functionality. Smarty provided the site with a feature rich "plug and play" template system, while still providing scalability thanks to its built in caching system. It was a good solution for the project.
- I've used others beyond Smarty, and, given my focus on PHP and core programming fundamentals, I think I can use any PHP-based template engine if you think this is the right approach for your project.
- Widgets
- I have created custom widgets, widget management systems, and security systems to manage access to widgets. For the latter, I designed a system to communicate with Active Directory, which is a Microsoft technology for managing authentication in addition to other network services. Thus, clients only need to manage one set of users and permissions; that is, what is defined in terms of access and security for the internal network is propagated to the Intranet and Internet features as well.
- I think it's useful to provide my definition of this term. Technically, a widget could really be anything in terms of programming and the web. Most would state it's a small component that does a specific action or displays a specific piece of data. Often, it's deployable by non-developers with ease. To me, it's something that is either independent, or provides support to the main purpose of the currently-viewed page.
- Here are some examples of widgets:
- Drupal's block system is a good example of a widget. A block is a snippet of HTML that is either static or dynamically-generated. It usually displays some info, or provides access to additional features based on the page you're working with.
- A sign-on form for logging into an account on a website.
- The chat box that appears on the left-hand side of Google Apps and the iGoogle homepage.
- Here are some widgets I've implemented:
- Widgets for performing work and seeing data, including a cafeteria widget, a scheduling widget, and a quality-reporting widget.
- Widgets for administrators to change the look and feel of a web site using drag-and-drop functionality.
- A number of tools, including quick-add features, login features, navigation links, page actions, CRM search, language selection, etc.
- Mac and PC/Windows Server Experience
- Mac:
- I have programmed web services to ensure they work in Safari (though Safari usually handles standards compliance well).
- I know Linux, which is a Unix-like system, and Mac OS is based on BSD.
- I run an Apple Mac Pro as my primary desktop machine. It's a quad-core 2.9 GHz system with 8GB of RAM. It runs two NVIDIA cards that handle my four monitors. I like it because it provides all the power of the BSD command line while providing a nice, polished graphical interface. Best of both worlds.
- For my primary laptop, I use an Apple MacBook Pro. It's a Core 2 Duo 2.8 GHz system with 8GB of RAM. It provides a good companion to the desktop, as I can use the same environment and applications for development on both machines.
- PC and Windows Server:
- I have programmed web services to ensure they work on the most popular PC-based web browsers.
- I've supported or managed the support of PCs and server environments since 1995.
- Prior to putting my focus on software development, I have done many Windows Server implementations. Examples of what I can do: Multi-site Active Directory, Group Policy with advanced .vbs and .bat logon scripts, DNS/DHCP, Distributed File System, Routing & Remote Access for VPN and WAN connectivity, Exchange Server, SharePoint Portal Server, and Internet Information Server. Having this knowledge really helps me connect my web services to Microsoft-driven services. For instance, I can have the authentication system of an Intranet or Internet site be driven entirely off of Active Directory.
- Linux/Apache & Systems Administration
- My Linux and Apache systems administration skills are advanced.
- For AWS experience, please see my Amazon Web Services (AWS) and Management section.
- I've managed Apache and sendmail processes since 2001 at Up and Running.
- I've employed Nginx professionally for years so I have LEMP covered as well as LAMP. Example implementations:
- Configured it for use in static content hosting.
- Configured it to function with PHP through the use of FPM for CGI processing.
- Set up Nginx to work as a reverse proxy with Apache. Nginx took care of static hosting and Apache served dynamic content using the mod_php module. This provided better resource management.
- I maintain a virtualized machine that hosts redundant firewalls, production, test, database, and staging machines.
- I use memcached, which "is a high-performance, distributed memory object caching system, generic in nature, but intended for use in speeding up dynamic web applications by alleviating database load." (Source: http://www.danga.com/memcached)
- I have used Apache mod_rewrite to protect a web-accessible folder and modify the requested URL to call a wrapper script that can use a web application's security system to access the file. This is beneficial in that standard FTP and HTTP file operations on the webserver continue to work as defined, and are enforced by the web application.
- Most web applications I've written also use email communication, and I've written a set of routines in PHP to handle these tasks.
- Examples of services I've deployed:
- Asterisk solutions, including custom programming.
- I have designed a centralized HA Asterisk environment that will accept and authenticate calls from remote PBXs on the same VPN using SIP registrations. This HA solution utilizes Heartbeat (formerly Linux-HA), DRBD, and custom socket monitoring scripts for STONITH for failover between nodes. All remote PBXs register to this cluster, and all inbound and outbound traffic as defined in their dialplans is pushed via 120 SIP trunks from the service provider. This is a robust, scalable solution (in its current incarnation it can support ~20 remote nodes), and is ideal for rural environments.
- Multi-site Asterisk solution deployments.
- Unified messaging: the ability to capture many different forms of communication (fax, email, voicemails, SMS, etc), and deliver them to a single location, accessible by various devices, including computers, smart phones, phones, web accessible devices, etc.
- Voicemails can be emailed to one's inbox, and filed electronically.
- Extensions that allow you to transfer calls to cell, home, and other phones, making it seamless for the caller.
- Custom menus for easy navigation for commonly-used features.
- Remote access to phone and messaging system over a secured internet connection.
- Custom software for call accounting and tracking for a billing system.
- Postfix/Sendmail services.
- POP3/IMAP mail servers.
- MySQL and PostgreSQL database servers.
- BIND master/slave DNS servers and replication.
- Samba file shares for cross-platform access.
- Samba Windows domain emulation.
- Set up commercially-signed certificates.
- Intrusion detection services.
- FTP/SSH services.
- VPN systems.
- openVPN.
- IPSec tunnels.
- IPTable setups for complex router configurations.
- Version control solutions for programming or updated documentation projects.
- Open source backup solutions.
- Server monitoring tools.
- Workstation environments for office workers, programmers, and system administrators.
- VMware infrastructures that host redundant firewalls, production, test, database, and staging machines.
- Distributed and grid computing.
- Deployment of co-located servers for customers.
- Distributions I've used:
- CentOS
- Debian
- FreeBSD
- Gentoo
- Knoppix
- Slackware
- SuSE
- Red Hat
- RHEL
- Ubuntu
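The mod_rewrite wrapper-script technique mentioned earlier in this section can be sketched in PHP. The rewrite rule, directory layout, and access callback below are illustrative assumptions rather than the original implementation; the real script would also set content-type headers and stream large files:

```php
<?php
// mod_rewrite sends requests for protected files to this wrapper, which
// checks the web application's own access rules before serving the file.
//
// Example .htaccess rule (assumed layout):
//   RewriteRule ^files/(.*)$ /download.php?path=$1 [L,QSA]

function resolveProtectedFile(string $requested, string $baseDir): ?string
{
    $real = realpath($baseDir . '/' . $requested);
    // Reject traversal attempts: the resolved path must stay inside $baseDir.
    if ($real === false || strpos($real, realpath($baseDir)) !== 0) {
        return null;
    }
    return $real;
}

function serveIfAllowed(string $requested, string $baseDir, callable $userMayAccess): ?string
{
    $path = resolveProtectedFile($requested, $baseDir);
    if ($path === null || !$userMayAccess($requested)) {
        return null; // caller would emit a 403/404
    }
    return file_get_contents($path); // caller would set headers and stream
}
```

Because the rewrite happens inside Apache, direct FTP/SFTP operations on the folder keep working while HTTP access is governed entirely by the application.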
- Performance and Scalability Experience:
- I build performance and scalability into my systems because it's natural for me to do so. Best practices lend to a scalable and high-performance system, and these must be implemented and followed at all levels of a software development project: at the design and architecture level, at the functional (programming) level, within the database setup, within the systems administration configuration, and within the business processes themselves.
- I also know when performance and scalability design should be ignored. Of course, if the system is only used for a prototype or concept or for a small set of users, then the right amount of resources and best practices must be used so that value, as my client defines value, is maximized.
- To demonstrate my knowledge of scalability and performance measures and best practices, I'd like to include some specific examples:
- Cache when you can: Memcache is quite fast, and helpful in this respect. I've also used Memcache (and Memcachedb) as a store for common info like user accounts, profile data, etc. as it scales much better.
- One has to be mindful of the size of objects you cache in session as well. Coarse-grained calls make sense to minimize DB resource consumption, but composite or grouped objects can consume substantial resources on high-traffic sites, even those clustered and load balanced.
- Serve only dynamic pages in Apache: offload the static content to lighttpd+varnish or a CDN like Amazon S3 or Akamai. Contrary to popular belief, Apache+mod_php is usually faster than using a FastCGI implementation.
- Latency is key: one has to be very careful with web service calls and blocking SQL statements. Offload things like sending email, processing credit cards, etc. to background processes dedicated to this so you don't have Apache sessions hanging open doing nothing. They're expensive.
- Apache needs the correct number of client connections set, which is usually the available RAM divided by the resident size of an Apache process, so you don't swap.
- Statically compile all modules you use (a small gain), and remove all modules that you can. Resident size is very important; many configs dynamically load all modules via LoadModule, which can double the size of each Apache process and, with it, the RAM needed to handle the same number of connections.
- Sharding and partitioning: once you hit a certain point, it becomes very difficult to scale MySQL. It's important to logically partition/shard your data; with the 300m users I worked with on one project, the user database I developed was sharded into 8 physical MySQL clusters, each with a master and slave.
- For full-text search, Sphinx is blindingly fast, but it can also be used for fast lookups like email -> uid to stay off the MySQL servers. This is especially good if you have sharding; it's a great index telling you which partition to find a user/object/etc on.
- In PHP, there's plenty to avoid. Most common frameworks are not designed to deal with a large load, and templating systems are just attempts to reimplement PHP in PHP anyway (XSLT is especially bad). Avoid dynamically-loading classes, class_exists(), and relying on include paths - use a constant like MYLIBPATH and include all things relative to that.
- JSON (and serialized PHP if you can stick to one platform) is a lot faster for IPC than XML.
- Statistics tables versus counting/grouping dynamically: instead of counting how many products the customer has purchased each time, store that count, and update it when it changes. Keep triggers/stored procedures/etc out if at all possible.
- Minimize cURL calls, and wrap any web service calls you depend on in microtime() measurements.
- Be mindful of the impact in the architectural phase of external dependencies: Google (especially Analytics), Amazon/A9, Akamai, and ad servers are notorious blockers with high average latency impacts.
- Autoloading class frameworks are critical in large PHP applications to avoid high procedural processing overheads. Also, only lazy load objects as needed.
- There is a wide variety of settings (maxconnections, requestsperconnection, filehandles, modules loaded, etc.) that can greatly impact the performance of an instance of Apache.
- Run EXPLAIN on each query over a certain time threshold.
- Design a normalized schema, and then break that normalization if needed to gain efficiency increases (normalization is a guideline, not a hard and fast rule).
- Seek low overhead on calls that are meant to be fast: AJAX server-side code, for example, should be extremely lightweight.
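The "cache when you can" point above follows the cache-aside pattern. In this self-contained sketch an array-backed class stands in for Memcached, so the class and function names are my own; in production the same get/set calls map onto the Memcached extension:

```php
<?php
// Cache-aside: look in the cache first; fall back to the expensive loader
// (e.g. a DB query) and store the result for the next request.
interface KeyValueCache
{
    public function get(string $key);          // returns value, or false on miss
    public function set(string $key, $value, int $ttl): void;
}

class ArrayCache implements KeyValueCache
{
    private array $data = [];
    public function get(string $key) { return $this->data[$key] ?? false; }
    public function set(string $key, $value, int $ttl): void { $this->data[$key] = $value; }
}

function loadUserProfile(int $userId, KeyValueCache $cache, callable $loadFromDb): array
{
    $key = "profile:$userId";
    $hit = $cache->get($key);
    if ($hit !== false) {
        return $hit;               // cache hit: no DB round trip
    }
    $profile = $loadFromDb($userId);
    $cache->set($key, $profile, 300); // cache for 5 minutes
    return $profile;
}
```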
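The Apache sizing rule above is simple arithmetic, sketched here to show why trimming module bloat matters (the numbers are hypothetical):

```php
<?php
// MaxClients is roughly the RAM you can give Apache divided by one
// child process's resident size; halving the resident size doubles
// the connection ceiling for the same RAM.
function maxClients(int $availableRamMb, int $residentSizeMb): int
{
    return intdiv($availableRamMb, $residentSizeMb);
}
```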
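The sharding point above can be sketched as a stable routing function matching the eight-cluster layout described; the DSN strings and function names here are placeholders, not the project's actual topology:

```php
<?php
// Shard routing: hash the user id to a stable cluster index, then pick
// a DSN for that cluster's master.
const SHARD_COUNT = 8;

function shardFor(int $userId): int
{
    return $userId % SHARD_COUNT; // stable as long as SHARD_COUNT is fixed
}

function shardDsn(int $userId): string
{
    return sprintf('mysql:host=user-shard-%d-master;dbname=users', shardFor($userId));
}
```

Resharding (changing SHARD_COUNT) requires a data migration, which is why the partitioning decision belongs in the architecture phase, not as a retrofit.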
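The JSON-versus-XML IPC point above comes down to cheap, lossless round trips. A minimal sketch of both options mentioned (function names are mine):

```php
<?php
// Both json_encode() and PHP's native serialize() preserve this payload.
// JSON is the portable choice across platforms; serialize() is an option
// only when both ends of the IPC are PHP.
function jsonRoundTrip(array $payload): array
{
    return json_decode(json_encode($payload), true);
}

function phpRoundTrip(array $payload): array
{
    return unserialize(serialize($payload));
}
```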
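The microtime advice above amounts to routing every dependent web service call through a timing wrapper, so slow external calls surface in logs instead of silently stretching page loads. A minimal sketch, with a hypothetical function name:

```php
<?php
// Run a callable and report its wall-clock cost in milliseconds by
// reference, so the caller can log or alert on slow external calls.
function timeCall(callable $fn, ?float &$elapsedMs = null)
{
    $start = microtime(true);
    $result = $fn();
    $elapsedMs = (microtime(true) - $start) * 1000;
    return $result;
}
```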
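The autoloading and MYLIBPATH points above can be combined into one sketch: classes load lazily from a single constant-rooted path, avoiding include-path searches. The demo class written to a temp directory exists only to make the sketch self-contained:

```php
<?php
// Create a throwaway library directory with one class file (demo only).
$libPath = sys_get_temp_dir() . '/mylib_demo_' . getmypid();
@mkdir($libPath);
file_put_contents(
    $libPath . '/DemoWidget.php',
    '<?php class DemoWidget { public function name(): string { return "demo"; } }'
);

// All includes resolve relative to one constant, never the include path.
define('MYLIBPATH', $libPath);

spl_autoload_register(function (string $class): void {
    $file = MYLIBPATH . '/' . $class . '.php';
    if (is_file($file)) {
        require $file; // loaded only when the class is first used
    }
});
```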
- Example applications of my performance and scalability knowledge:
- The Travelling Prince (no longer in operation):
- Description: I implemented the site layout, database structure, and the business logic for a Home Exchange (where two members stay in each other's homes for a vacation). The site has a comprehensive search feature that matches visitors who want to visit a particular location with visitors who own a property in that location. It features searching and filtering on many facets, including location, date of visit, duration of visit, as well as dozens of other tags and parameters. Visitors have the option of performing an initial search and then drilling down, or expanding their search criteria if they find too many or too few records within their initial search. In addition to the primary search run by the user, the system also performs "border" searches that find similar results by slightly modifying the parameters provided by a user.
- Database Size: The site uses multiple large database tables, including a 7 million record Geonames database and a large table of properties.
- Number of Users: The site has been designed to support hundreds of concurrent users and as many as 100,000 properties in almost 7 million different locations.
- Relation to Performance Optimization: The search functionality involves very intensive queries to the property and location databases in order to locate both direct and "border" results. In order to maximize the performance of the search functionality, I use multiple, heavily-indexed cache tables. These cache tables provide a very noticeable performance improvement over querying the data tables directly, and allow searches to be performed in seconds instead of minutes.
- www.voices.com
- Description: I worked on the user interface design (not graphics, but the processes for visitors to use the site), database design, administration features, shopping cart, member sign-up, API for partners, and the communication system (internal email combined with external email). Also, the job process for this client is extremely complex, allowing for milestones by job, deposits tied to the milestones, releasing funds at certain stages, managing the agreement between the parties in the job, maintaining the data files, and attributing the file with the right step in the process. Other features of the site I contributed to include a profile section, feedback section, and search. The database member search functionality permits an extensive filter selection, and these can be drilled down into or selected on the top level. Corresponding data is pulled according to the search criteria and the desired results. The search feature is the guest feature most often used.
- Database Size: 1.2 GiB; 6.7 million records.
- Number of Users: The site has 65,000 registered members. On average, thousands visit daily.
- Relation to Performance Optimization: The home page utilizes some very SQL intensive functions that were taking up to 90 seconds to perform. I optimized the page, getting the search results to load in under 1 second. Furthermore, on the admin and internal parts of the site, I optimized the functions that interact with the database, both by changing the structure of the database and the server configuration, as well as the code for the search. The work minimized the return times on complex reporting, and allowed the site to function significantly better during peak times (when new job notices go out and everyone is logged in to submit their bids for them).
- Can't Name Company due to NDA:
- Description: Consistently a top 100 site in Alexa rankings. I used my best practices to contribute to their backend architecture in the same manner I would for any client.
- Database Size: I don't know, but they get over 30 million unique visitors/month.
- Number of Users: I measure users in millions of DAUs (daily active users).
- Relation to Performance Optimization: High relation.
- Systems Architect Experience
- I have been the lead Systems Architect at Up and Running for technology solutions since its beginning in 1995. I have served in this role for many Up and Running clients and for previous employers.
- I know both worlds. I have technical skills at the micro and macro levels, and I can move between specialist and generalist-level thinking quickly as needed. In summary, I can implement and architect, which I believe allows me to produce more in either area.
- I believe the ability to organize a system arrives after understanding the processes and components of that system, in this case the system that enables the development, deployment, and maintenance of software. There are the technical aspects (hardware and software), the process aspects (methodologies for software development and communication, systems architecture, managing people, managing projects, client interaction, and more), and finally and most importantly, the ability to assemble a good team and create the environment that enables and promotes success. All of these must be done while balancing scope, money, and resources. Each project offers its own challenges and rewards as each project requires its own mixture of these elements.
- I know how to research options for all aspects of a software project and present these options to stakeholders intelligently. I know that each stakeholder has his or her decision factors, and that it's important that they be able to assess options easily and quickly. I also understand that I should give my recommendation if it's applicable and state pros and cons of a particular decision as it relates to the software development teams' effectiveness.
- Experience in using and interpreting web standards (browsers, accessibility, and validity)
- I develop according to W3C standards, and write code to enable browsers that do not always conform to W3C standards.
- I stick to the standards as much as possible and when I can't do that, I use industry wide/supported alternatives to achieve the functionality I'm after.
- I use HTML 4.01 Transitional or XHTML 1.0 Transitional doctypes for my pages. This limits the occurrences of "quirks mode" being enabled, reducing most of the display discrepancies experienced between browser platforms.
- I also design my pages in a DIV/CSS style layout, making it very easy to present the page differently using alternate style sheets. This same design enables me to write for screen readers or braille readers just as easily. I can also format pages to display on handhelds.
- My framework can also support alternate configurations of the data for different viewing mechanisms. For instance, I can provide a full Web 2.0 interface for regular browsers, and, with the same data, have a complete set of alternate views for use with a PDA or screen reader, which typically cannot handle JavaScript.
- Graphics Design
- I have studied UI design, and have been applying it for years; I believe I've helped improve navigation and usability for many websites.
- I have basic graphics development skills (Adobe products & GIMP), though I prefer to seek design help for graphics, content, and the more aesthetic aspects of website creation. I enjoy, and believe I contribute most to, working on the engine of the software system.
- I do understand design in the sense that I know it's challenging, complex, and requires the mastery of several arts, similar to software development. I have enough knowledge to communicate well with designers and graphics development teams. I have served as the software development contact for design businesses and communications departments. When our arts are combined, I think great things happen.
- SEO/SEM: I have working knowledge of this. However, to do this extremely well, depending on the site, this could easily be a full-time position in and of itself.
- Tools: These are the tools I use to organize work, and get work done faster. I've used many, and can adapt to what your preferences are.
- Version Control System or Source Configuration Management (SCM): I primarily use Git to manage the codebases I'm responsible for. There are a few Subversion (SVN) repositories I work with as well because I adapt to the tools that clients like to use. I have been asked by those I serve to install, configure, and establish best practices when using these tools. Regarding Git specifically:
- The feature branch approach is the one I usually use. Basically, this means branching off of a known stable branch (master, production, etc.) to contain the work for a given ticket, fix, or feature. Branches can easily be pushed for peer code review. When the code is signed off on, it can be merged into one of the stable branches for deployment to a specific environment.
- I typically track different environments for an application in different branches in Git. I normally use the master as the main stable line of development, a staging branch (using tags for specific deployments), and production or production release branches for production deployments (again, tagging along the way). This is the baseline approach, and other approaches can easily be used to support specific development workflows and stakeholder needs.
- I have set up and used Gitolite and GitLab services for hosting Git repositories.
- It's been part of my responsibilities to sync and coordinate products across different remotes for exchanging project data with different stakeholders.
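For illustration, here's a minimal sketch of that feature-branch flow; the ticket number, branch names, and tag date are hypothetical:

```shell
# Hypothetical scratch repo; branch and tag names are illustrative.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "dev@example.com"
git config user.name "Dev"
stable=$(git symbolic-ref --short HEAD)   # master or main, depending on git defaults

echo 'base' > app.txt
git add app.txt
git commit -qm 'Initial commit on the stable line'

# Contain the ticket's work in its own branch off the stable line.
git checkout -qb feature/1234-login-fix "$stable"
echo 'fix' >> app.txt
git commit -qam 'Fix login redirect (ticket 1234)'

# After peer review signs off, merge into the stable branch...
git checkout -q "$stable"
git merge -q --no-ff feature/1234-login-fix -m 'Merge ticket 1234'

# ...and tag the specific deployment to the staging environment.
git tag staging-2011-06-01
git tag --list
```

The `--no-ff` merge keeps the ticket's work grouped under a single merge commit, which makes it easy to see (and revert) a feature as one unit.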
- Collaborative Software: wikis, SharePoint, Google Sites, Basecamp, Exchange, and custom Intranets.
- Communication: email, VOIP on an internal Asterisk solution, Treo with Jawbone headset, and Pidgin with most IM accounts.
- Desktop Sharing: Ultra VNC and CrossLoop.
- Bug Tracking: Mantis and FogBugz. I have used Bugzilla, RT, and Flyspray.
- Project Management: spreadsheets (I believe Excel or Google Docs can accomplish most PM needs; people and processes are more important than the tool.), Microsoft Project, SharePoint, BaseCamp, and custom systems.
- Browser Testing: Firefox's Web Developer Toolbar and Firebug and IE's Web Developer Toolbar. I also run VMware on my Gentoo computer to QA systems on various browser versions: Windows XP Pro with IE7, Windows 2000 with IE6, Ubuntu with Opera and Konqueror, and another Gentoo environment for a clean development environment.
- Editors and/or Integrated Development Environments (IDE): Gvim/Vim, Eclipse, Visual Studio, and the Linux Command Line.
- Advanced Search Solutions:
- I've written a few full-text index search systems that index content information in MySQL and PostgreSQL. The most complicated search system I've written is within my custom framework; it can take a system object, use reflection to process its getter methods and child relationships to generate an index list, and store that in a generic search table. A system-level search then allows those object items to be pulled up from keywords, and directs the user to the correct views that render that object's data. It's fast, flexible, and allows for any model data to be searched for and displayed.
- I've implemented and customized Google Search appliances for customers.
- I know Lucene/Solr, and can implement this open source, advanced search functionality to provide Google-like search results, with relevance percentages reported back to the user. This can be very useful for e-commerce implementations or for websites where there is a lot of data. As an example, I implemented this for a job and candidate management site; the result was that they had far more transparency into their job and candidate data, which helped them and their end users accomplish their objectives faster and with more accuracy (better matching between job needs and candidate capabilities).
- I know Endeca, a commercial, enterprise search tool used to provide advanced search and information presentation capabilities. Example features I can implement: products can be presented based on availability; landing pages, category lists, navigation, labels, and more can be created dynamically; and intelligent searching enabled based on a dynamic dictionary generated and maintained by user-entered and administrator-entered data (allows for many presentations of data based on what the user meant to type; very helpful for commonly-misspelled products).
- Security Knowledge
- I've worked with highly-sensitive and secure materials, including financial data, healthcare data, and data formally classified as sensitive by the government and vendors who serve the government.
- I have run white hat security penetration testing for customers.
- I've rebuilt several compromised servers. (For the record, I didn't set them up.)
- I have read most of Kevin Mitnick's books, and have helped one customer organization develop its social engineering policies.
- Licensing
- I utilize business-friendly open source products in my applications.
- I restrict work with GPL, OSL 3.0, or other viral licenses to work that will comply with the business' objectives without compromising their intellectual property.
- I favor LGPL-like license terms, as these permit distribution of the software without requiring that the intellectual property built around the licensed software be released under the open source license's terms.
- I provide recommendations and advice on the various open source licenses and the impact utilizing the software under these licenses will have on the project's IP rights.
- I strongly support proper software IP management for reduced liability and compliance throughout the project.
- Technologies that I'm not an expert in but am often asked about: When asked whether I know a particular tool, my usual answer is that understanding the primary underlying technologies (PHP and SQL, for example) is the key. At this point in my career, my learning curve for most tools is nearly flat. Put another way: I have theory and experience across many tools, languages, and processes that I can apply regardless of the new widget, tool, or methodology. I'm not claiming to be an expert at everything; rather, I'm a highly adaptable professional when it comes to any tool related to web services.

Other Relevant Information
- Usability: I subscribe to Jakob Nielsen's Alertbox, and have studied user interface design. My primary concerns are usability and flexibility when creating a UI. I have read these books so that I can work better with copywriters and designers, and design better myself:
- "The Design of Everyday Things" Amazon link
- "The Inmates Are Running the Asylum" Amazon link
- "Homepage Usability: 50 Websites Deconstructed" (by Jakob Nielsen) Amazon link
- "Designing Interfaces: Patterns for Effective Interaction Design" Amazon link
- "About Face 3: The Essentials of Interaction Design" Amazon link
- "Observing the User Experience: A Practitioner's Guide to User Research" Amazon link
- "The Design of Sites: Patterns for Creating Winning Web Sites (2nd Edition)" Amazon link
- "Waiting for Your Cat to Bark?: Persuading Customers When They Ignore Marketing" Amazon link
- "Call to Action: Secret Formulas to Improve Online Results" Amazon link
- "Persuasive Online Copywriting: How to Take Your Words to the Bank" Amazon link
- Agile Development
- Up and Running, the company I founded, believes in the Agile methodology for development and deployment of software because of its focus on the customer and the customer's feedback.
- I've used more standard forms of development life cycles, such as the waterfall methodology. I recognize the importance of following the customer's preferred means of doing work, and am able to adapt to any software development methodology.
- I believe this methodology is responsible for Up and Running being able to deliver products that customers use and like.
- Due to the high levels of customer and user interaction I incorporate within my projects, my systems are well received. I believe in user-focused, interface-driven software development because if the system is not usable, it will not be used in most cases.
- I use tracer bullet development (Hunt, Andrew and David Thomas. The Pragmatic Programmer: From Journeyman to Master. Addison Wesley, 1999.) on larger scale systems to retrieve faster feedback from customers on UI designs and system workflows. Amazon link | My book review
- Besides the book mentioned above, I have read these books on the subject:
- "Agile Retrospectives: Making Good Teams Great" Amazon link
- "Practices of an Agile Developer: Working in the Real World" Amazon link
- "Release It!: Design and Deploy Production-Ready Software" (Not directly related to Agile development, but it's applicable.) Amazon link
- "Ship it! A Practical Guide to Successful Software Projects" (Not directly related to Agile development, but it's applicable.) Amazon link
- Besides the above, I have read books on estimation, managing software developers, and the requirements gathering process.
- Testing or Quality Assurance (QA):
- This is a topic I've spent real time thinking about and acting on. This section relates mostly to development-focused testing, not user testing.
- I’m used to working with QA professionals and without them. The approach just depends on the context of the project and its stakeholders.
- My current overall stance on Quality Assurance is that it's a cost/benefit decision. QA can absorb an unlimited budget and still not produce perfection. As a thought experiment, try to name the software companies that produce defect-free software. None, right? If billion-dollar companies can't prevent defects entirely, individual developers certainly can't. In areas where errors are unacceptable or carry high liability (healthcare software, defense systems, automotive systems, security systems, flight systems, etc.), QA is handled by dedicated teams of professionals who have built their careers within this specific segment of the software development world. These companies and projects pay for QA all the time, for everything, because that's cheaper than paying for downstream problems (in addition to being the ethical thing to do in the aforementioned examples). In my experience, web software is different: most people and companies do not want to pay for QA continuously throughout a project. I believe it is more cost effective to pay downstream in most cases, but it is the Client's decision to make. I just ask that the Client let me know how they want to approach testing.
- I have worked with classified information, nuclear facility data, United Nations data, financial data, healthcare data, and millions of records of personal information. In these environments, there are top-quality procedures that I’ve experienced, used, and contributed to.
- If warranted, I believe in using a complete test process for all aspects of the application that involves all stakeholders who can contribute: primary programmers, peer programmers, quality assurance specialists, systems administrators, security experts, project managers, customers, and, most importantly, the user. That is, testing is not a single person’s job, it’s everyone’s job. Ideally, it’s built into the culture, how people think and act by default.
- I believe developers should use effective tools and methods to test their work, the ones that make sense for the project and the people on that project. Some examples are: Peer Reviews, Unit Testing, Automatic Build Testing, Test Cases, User Story Confirmations, User Interface and Navigation Checklists, and Development Completion Checklists.
- I think project managers and quality assurance specialists should use these tools and methods to check a developer's work: Test Cases, User Story Confirmations, and User Interface and Navigation Checklists.
- I’ll typically approach testing in one of two general ways depending on the state of the project:
- If it’s a greenfield project (new, from scratch), I’ll start off with unit testing and proper breakdown of modules/classes so the system is easy to test going forward. I’ll roll in functional tests as use cases are defined and if the Client is interested in that level of testing.
- If it’s a project-adoption scenario, then it becomes more of a challenge. The main issue is determining what state the code is in and to what level of quality the previous developer(s) performed testing. For some projects, it’s been very easy to roll in unit testing, while for others it was financially impossible for the Client to do. In those cases, where it doesn’t make sense to retrofit unit tests, we can at least do functional testing. It’s not as targeted, but it does give us clarity if something unexpected happens, especially protecting against regression errors.
- I have more information about testing as it relates to PHP in my PHP Testing section
- Leadership
- I have helped grow Up and Running from a single person in 1995 to an organization that has 60+ employees, and serves customers across the world.
- I recognize that leadership is setting the direction, and that management is ensuring that direction is fulfilled efficiently.
- I know the relationship between Vision, Mission, Values, and Strategic Goals.
- At the start of software projects, I like to ensure that the vision and mission of the project are in direct alignment with the organization's vision and mission.
- I think it's important that people and projects have their own organization-focused vision, mission, and goals.
- I have read books by Jim Collins, Peter Drucker, Stephen Covey, and Tom Peters on leadership and management topics.
- Management
- I believe a manager needs to set the stage for success, and do whatever is needed to help team members achieve success.
- I have learned how to assess potential team members, as well as ways to quickly learn if someone is capable.
- I have hired and fired people; though I don't enjoy the latter at all, I view it as a responsibility if the person is not improving after being given the opportunity to do so once given objective, constructive feedback.
- I have led, coached, and mentored other team members, and enjoy learning from my team members, my team lead, and all other peers.
- I have read materials on how Microsoft ("How Would You Move Mount Fuji?" Amazon link) and Google (online) hire, and have created two interviews to test for developer aptitude.
- I've written template job descriptions and responsibilities for organization webmasters.
- The Agile books I've read help me to be a better manager operationally.
Programming is more than just knowing the language of choice. It's also important to understand the mindset of the developer and how they approach problems. When I'm validating and assessing programming talent, I don't focus the conversation on the particular language or technology, but instead probe the meta-aspects of programming the person possesses. That is, can they think like a programmer, or are they just able to assemble and apply the tools that frameworks give them? Examples:
- How do they tackle problems? Do they look at the problem just in isolation of the task presented, or do they have the ability to step back, assess the bigger picture, and understand how the work could tie into the whole? When they present a solution, are they only able to do so within the context of their primary framework(s)?
- Can they review how a feature should work from a number of different angles? When they're evaluating bounds checks on inputs, do they do a good job of covering even weird or outlandish inputs? This is an important skill for writing test cases that have good code coverage and ensuring that an application doesn't get into an invalid state.
- What does their code architecture look like? How SOLID are they? It’s easy to write code quickly without considering this, but it’s much more difficult to write code that holds up to the test of time and adapts to inevitable downstream changes in a project. (This results in the creation of an asset, not a liability.) Are they good at minimizing the technical debt that a project can incur?
- How complete are the typical solutions they produce? Am I getting code that looks like it is test code or prototype code or something that has been thought through and likely has been through a refactoring process to make it better? Do tests exist for the code (a great sign that the developer is focused on taking care of the code and ensuring that it works for the long-term in an efficient manner)? Was what was done and not done introduced well so that expectations are managed (it’s fine not to do everything so long as it’s intentional, and you know what you would do with more time)?
- Team Work
- The foundation of a project's success is its people and the relationship between those people. It's important for the team members to have a shared goal, the tools to do their job, the knowledge to do their job, and respect and trust for one another.
- I have been working with customer teams since 1995 to deliver technical solutions.
- I have experience working with remote contractors and developers using various tools to manage projects, time, and collaboration.
- I have led, coached, and mentored other team members, and enjoy learning from my team members, my team lead, and all other peers.
- I am one of the owners in a company with the other owners living in Michigan and California; I'm used to working remotely, and can deliver results doing so.
- Interpersonal and Oral/written Communication Skills: I hope the letters of recommendation attest to my abilities in these areas. English is my primary language.
- Analytical Abilities:
It's hard for me to demonstrate these in writing, but I'll attempt to. I have taken personality type tests for previous employers, and those always result in labeling me an ENTJ. I think it's mostly accurate. "[They] are assertive, innovative, long-range thinkers with an excellent ability to translate theories and possibilities into solid plans of action." http://www.personalitypage.com/ENTJ.html
(I think I've addressed the potential weaknesses of this personality type through awareness.)
- Customer Focus
- I work closely with the client stakeholder team to understand requirements, system specifications, and design. This is standard operating procedure, and something I believe is very important for the success of a project.
- For our requirements gathering sessions, I generally approach them using a workflow-based framework as defined in this document, "Up and Running -- Workflow Analysis Introduction.pdf". This is part of our overall solution design methodology, as defined in "Up and Running -- Solution Design Methodology.pdf".
- I also maintain the perspective that it's my responsibility to ensure the system does what the end users want. I don't believe it's easy to define requirements, and this is why the Agile programming methodology is so appealing to me; it allows the customer to see how the system is progressing and to offer feedback at every iteration. The end result is a very nice product because it does what the users want and how they want it, meaning that the system will actually get used and not sit on a shelf.
- Business and Technical Communications:
My degree in Management of Information Systems gives me a unique perspective in that I understand business operations and software development operations. This allows me to bridge the two worlds well, meaning I'm very helpful at translating end user or business line requirements into functional or software development requirements. In other words, I can talk business and "geek". Examples:
- Since 1995, I've worked with several hundred clients across the nation, helping them translate business needs into technical or functional requirements, depending on the type of technology solution needed.
- At one client, I wrote and implemented an entire Manufacturing ERP to manage their manufacturing processes.
- At another, I implemented a system to lend more transparency to their project workflows in addition to streamlining payroll and customer invoicing processes.
- At another, I developed software to improve the quality of product delivery throughout all post-order operations management.

Some Personal Information
- Short History
- I grew up in Hancock, MI, which is in the Upper Peninsula (U.P.) of Michigan. It's known for its snow, beautiful scenery, and yoopers (wiki).
- I met and married my lovely wife, Kate Hanson — an Alaskan who went south to the U.P. for school, and she brought me to Missouri soon after so she could pursue her career as a teacher and be closer to family. We lived in Webb City, MO for about 7 years, and I didn't miss the snow the entire time.
- Erik Odin Hanson joined the family on 4/29/08 at 8:05am.
- Alora Rose Hanson was born on 12/28/09 at 9:12 am.
- My brother lives in Arizona, and the rest of my immediate family still lives in the U.P.
- As of June 2011, we live in Ames, IA because it is close to family and friends. If you like corn and a great community, Ames, IA is a wonderful place to be. (I can travel for work.)
- I think I'd be remiss if I didn't explain a little about my first piece of hardware I used a great deal and the first software program I wrote:
- First piece of hardware: This was a 486SX at 33 MHz with 4 MB of RAM. It ran DOS 6, and didn't have Windows. I had a 14,400 bps modem as well, which made browsing the BBSes in the area nice and fast. With that, I could download a 200K file in about 3 minutes. For the first two years of my computing experience, I just used a DOS environment. I was a teenager when I got this machine, and it was near top of the line then.
- First software program: I suppose this would be a batch file that used decision logic and a menu structure. The neighborhood kids would come over to play games on the 486SX computer described above. Since they didn't know DOS commands, they would constantly ask me to start a new game for them. That got tiring, so I wrote a batch file that would present a menu of games they could play. After a selection was made, it would modify any system parameters needed to play the game, and launch the application. Saved me a lot of time. : )
- Hobbies
- Tennis (3.5 on the USTA NTRP scale).
- Woodworking and, in general, fixing things.
- Figuring out how to completely automate my house with Arduinos and Raspberry Pis.
- Cooking and experimenting in the kitchen.
- Playing card games, board games, RTS games, first-person shooters, and puzzle games.
- Working with and experimenting with computers and software, mainly in open source technologies.
- Reading, mainly fiction, but also computer manuals and information on new technologies or subject matter on the projects I'm working on.
- Miscellaneous
- My friends call me "Smiling Pete".
- I work with people throughout the world with my current battle station / command center: a 2015 MacBook Pro with a 30" and a 27" monitor. (Scott Adams, author of Dilbert, states in one of his books, "An engineer who is surrounded by machines is never lonely and never judged by appearance. These are friends.") Here's my previous setup (> 80" of viewing area).
- I use an IBM Model M Keyboard. : ) (wiki)
- I own nearly every Dilbert book.
- I have a Black Belt in Tae Kwon Do, and currently teach (pro bono) at the local Boys & Girls Club. (So worth it! But I went through more paperwork and background checks to do this than for my mortgage.)
- I have two children, and though I once had to trade in my Jeep Wrangler for a minivan (please, no jokes), I was able to get another Jeep! (Still have the minivan though. Not every story has a happy ending.)

Final Thoughts
I would be honored to be able to work with you to accomplish your web-based goals.
If you have questions or would like to talk about anything, I hope you'll please call me at any time
(Cell: 906-281-1178). Thank you for taking the time to read about the opportunities I've been given to help
others and the technologies learned along the way. I hope I'll be given the chance to prove myself to you.
Respectfully yours,
Pete Hanson
906-281-1178
Peter@pkhanson.com
Software should yield returns and be accepted by the end users, and I believe
the recipe for success starts first with people, process next, and technology last.
I'd be pleased to help you navigate this complexity.