2019 Lean Martech stack for startup validation

As part of the antler.co June 2019 cohort, many of us are trying to validate startup ideas in a lean fashion. This article outlines what worked for three of us when doing customer development: we avoided building anything by leveraging existing SaaS products in the market and wiring them up smartly with some elbow grease.

Components of a lean martech stack

First up, what are the components of this stack?

  1. Top of the funnel: Collect a bunch of emails in your target market, run Facebook ads to get people to give you their emails, or scrape LinkedIn (warning: this is against LinkedIn’s terms, and you could get your account banned if you’re not careful)
  2. Outbound cold emailing tool / Facebook ads manager: Send emails out to your prospects, with a smart, automated way of following up to get them to do something small, like fill in a short survey or agree to catch up on a call with you.
  3. A landing page or a survey tool for users to register their interest: You could set up a landing page with enough details to intrigue them into registering interest for a longer call, or, if you’re running ads, a way for them to give you their details.
  4. A way to collect this information from your website and dump it in a central database: Typeform has Slack integrations that send messages to a private group or an individual on the pro plan, or use a tool like Zapier to wire up the landing page tool to a Google Sheet.
  5. A way to measure the funnel: a funnel analytics tool like mixpanel.com or heap.com, or even Google Analytics’ funnel feature.

What tools comprise a lean martech stack

Top of the funnel:

  • LinkedIn is your best friend if you’re targeting a B2B idea. Use InMail, or a LinkedIn prospecting tool at your own risk 🙂
  • Facebook ads can work for both B2B and B2C ideas to reach a lot of your target market
  • Try to find targeted Facebook groups and see if members would be willing to answer survey questions. Don’t spam or break individual group rules; see if you can add value to the group with your domain knowledge, and thereby get members to respond to your survey
  • See if there are online forums on your area of interest, and do the same there.

Outbound cold emailing tool

If you’re doing cold outreach emails, pick a dedicated cold emailing tool and run with it. Re-evaluate after a few campaigns in terms of opens, clicks and performance, and then move to the next tool. I love lemlist.

These tools connect to an existing Gmail account via the Gmail API and send emails on your behalf. Consider creating a new email address for every idea you’re validating so it’s on brand. If you want to go one step further, set up Google Apps on a domain and send from that. I haven’t seen any issues with using the free email service from Google, but there are limits to how many cold emails you can send before you end up blacklisted by ISPs and relegated to spam.

Airtasker surveys and other methods

You could also pay people in your target market to answer surveys on marketplaces like Airtasker.

Landing page tool and a way to store info in a database

There are thousands of landing page tools out there. Just get carrd.co Pro – it’s the easiest way to set up a landing page right now – and use Zapier to connect it to a Google Sheet.

A Way to Measure the Funnel

Again, there are tonnes of tools that do this. Just use Heap: it has a 30-day free trial, after which you can write to them to get a startup plan. Or use Google Analytics and its funnel feature.

What does the funnel look like?

Image to come

Running sendy on nginx – sendy nginx rules

Sendy lets you send newsletters very cheaply. It’s a self-hosted script that sends through Amazon’s AWS (SES).

Sendy’s a marketer’s wet dream

Sendy comes with instructions for setting it up on Apache. I recently had to set up Sendy in a subdirectory called “mailer” on nginx.

These are the rules I came up with:
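Roughly the following – this is a sketch modelled on Sendy’s stock Apache .htaccess rules, and the PHP-FPM socket path is an assumption you’ll need to adjust:

```nginx
# Sendy lives in the "mailer" subdirectory under the web root
location /mailer {
    index index.php;
    try_files $uri $uri/ @sendy;
}

# Map Sendy's extensionless "clean" URLs back onto their .php scripts
location @sendy {
    rewrite ^/mailer/([\w-]+)$ /mailer/$1.php last;
    rewrite ^/mailer/l/([\w/]+)$ /mailer/l.php?i=$1 last;
    rewrite ^/mailer/t/([\w/]+)$ /mailer/t.php?i=$1 last;
    rewrite ^/mailer/w/([\w/]+)$ /mailer/w.php?i=$1 last;
    rewrite ^/mailer/unsubscribe/(.*)$ /mailer/unsubscribe.php?i=$1 last;
    rewrite ^/mailer/subscribe/(.*)$ /mailer/subscribe.php?i=$1 last;
}

location ~ ^/mailer/.+\.php$ {
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_pass unix:/run/php/php-fpm.sock;  # assumption: point at your PHP-FPM socket
}
```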

The important parts to consider are these lines which let sendy operate flawlessly:
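They’re the rewrites that turn Sendy’s extensionless tracking links (l/, t/, w/, unsubscribe/) into calls to the matching .php scripts, roughly:

```nginx
rewrite ^/mailer/l/([\w/]+)$ /mailer/l.php?i=$1 last;
rewrite ^/mailer/t/([\w/]+)$ /mailer/t.php?i=$1 last;
rewrite ^/mailer/w/([\w/]+)$ /mailer/w.php?i=$1 last;
```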


How to turn off auto indent when pasting stuff into vim on the terminal

You can turn on automatic indenting in vim by setting an indent option in your ~/.vimrc, such as the line below.
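For example (“set autoindent” is the minimal option; “filetype plugin indent on” is a common, smarter alternative):

```vim
set autoindent
```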

While automatic indentation is almost the next best thing to baked bread, especially when you code on a remote server, it’s painful when you paste something from your local machine’s clipboard: vim goes overboard and indents code like this:

Vim auto indenting

In order to paste stuff from the clipboard into a file open in vim via a terminal, without vim auto-indenting it badly, you need to enable paste mode.

By typing “:set paste” you can enable paste mode, which will let you paste stuff without vim indenting it multiple times.

By typing “:set nopaste” you can go back to the auto-indent mode.

This can also be toggled easily by using the following command:
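The toggle, plus an optional key binding for it in ~/.vimrc (F2 here is just an assumed choice):

```vim
:set paste!

" In ~/.vimrc, to bind the toggle to a key:
set pastetoggle=<F2>
```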

Now, it can be a bit of a pain to remember to toggle paste mode on and off. A good samaritan has written a vim pathogen bundle that makes this automatic.

Vim Bracketed Paste plugin lets you easily paste stuff without having to toggle paste modes.

Installing vim bracketed paste plugin is easy:

  1. Install pathogen first.

     
  2. Install vim bracketed paste

     
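A sketch of both steps, assuming the standard pathogen layout and the ConradIrwin/vim-bracketed-paste repository:

```sh
# 1. Install pathogen (per its README)
mkdir -p ~/.vim/autoload ~/.vim/bundle
curl -LSso ~/.vim/autoload/pathogen.vim https://tpo.pe/pathogen.vim

# 2. Install vim-bracketed-paste as a pathogen bundle
git clone https://github.com/ConradIrwin/vim-bracketed-paste \
    ~/.vim/bundle/vim-bracketed-paste
```

Remember that pathogen itself needs “execute pathogen#infect()” in your ~/.vimrc to pick the bundle up.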

Hope this helps!

Excluding symlinks while grepping

When grepping for a term, I usually have to wade through results that are duplicated because of symbolic links in my codebase. Excluding symlinks while grepping saves a lot of time. Let’s look at how to achieve that.

In order to exclude symlinks on search results of grep, I use a combination of find and grep like below:
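A runnable sketch – “needle” is a stand-in search term, and the little fixture only exists to demonstrate the behaviour:

```shell
# Fixture: a real file plus a symlink pointing at it
mkdir -p grepdemo
echo 'needle' > grepdemo/real.txt
ln -sf real.txt grepdemo/link.txt

# find -type f matches only regular files, so the symlink is skipped;
# grep -H prints the file name alongside each matching line
find grepdemo -type f -exec grep -H 'needle' {} +
```

Only grepdemo/real.txt shows up in the results; the symlink is never handed to grep.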

The “-H” option on grep prints the file’s path along with each match in the search results.

A useful refinement is to ignore .git directories, which can also pollute search results. You could pipe the output of the first find command through another grep that filters out the .git folders, but that’s a little slower than doing it within find itself.

The find command below does this faster, without piping to grep at all.
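A sketch of that, with “needle” again as a stand-in term – -prune stops find from descending into .git at all, rather than filtering its output afterwards:

```shell
# Fixture: a .git directory whose contents would otherwise pollute results
mkdir -p gitdemo/.git
echo 'needle' > gitdemo/.git/config
echo 'needle' > gitdemo/main.c

# -prune skips the whole .git tree; everything else goes to grep as before
find gitdemo -name .git -prune -o -type f -exec grep -H 'needle' {} +
```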


Export MySQL query results to CSV from CLI

This post is mostly for me to be able to quickly look up the way to export MySQL query results to CSV from the command line 🙂

Assuming there’s a users table with name and email as its columns, you can get all users with this query:
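Given that schema, the query is just:

```sql
SELECT name, email FROM users;
```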

You can export this data from the command line to a csv file like below:
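MySQL’s INTO OUTFILE clause does this; a sketch, with /tmp/users.tsv as a stand-in path (the server writes the file, so the path is on the database host):

```sql
SELECT name, email FROM users
INTO OUTFILE '/tmp/users.tsv';
```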

The command above generates a “tab separated values” file, or “tsv”.

In case you want an Excel compatible csv, run this command:
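A sketch, again with a stand-in path:

```sql
SELECT name, email FROM users
INTO OUTFILE '/tmp/users.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';
```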

The above query will produce a file with rows like these:
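With hypothetical data, something like:

```
"Jane Doe","jane@example.com"
"John Smith","john@example.com"
```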

A couple of gotchas:
1) The MySQL server needs write permissions on the directory you’re writing the file to.
2) The output file must not already exist – which is a bit of a bummer when you’re iterating on queries and trying to produce the right version of the file.

Automating FTP transfers on Linux

I was recently moving a lot of files between servers, hugely constrained by the fact that the servers only had FTP available. Even though I understand that it’s not secure AT ALL, there was nothing I could do but automate FTP transfers on Linux. FWIW, if you have the choice, DO NOT use FTP to transfer files between servers; tools like rsync make this process easy. Now that the disclaimers are out of the way, let’s dive in.

Any FTP client basically operates by issuing a bunch of commands to the remote server in sequence. The basic idea goes like this:

FTP CLIENT> Open connection to server
FTP CLIENT> This is my username and password (notice how it’s not encrypted in any way, and thus is open to getting sniffed by any one monitoring the traffic between your machine and the destination machine, which is why it’s insecure)
FTP CLIENT> Move to directory of interest
FTP CLIENT> Issue a “get” command to download files to your local computer
FTP CLIENT> Finally say “KTHXBAI”. 🙂

Let’s look at a file that issues these commands and assume it’s saved as ftp_commands.txt
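A sketch of such a file, with a placeholder host, credentials, directory and file name:

```
open ftp.example.com
user myusername mypassword
cd /path/to/files
binary
get backup.tar.gz
bye
```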

Now let’s use the ftp command and actually run it:
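With the commands saved in ftp_commands.txt:

```sh
ftp -n < ftp_commands.txt
```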

The -n flag is the one that does the trick here: it prevents ftp from attempting to automatically log in upon issuing the “open” command. In this mode, ftp expects the username and password to be supplied, which we feed to it right after the open command, and we’re able to log in to the server.

Get more productive with SSH using tmux

I’m sure all of us who’ve used SSH to manage a server have faced these issues at some point:

  1. A running process gets killed if the internet gets disconnected
  2. Multi-tasking on the server (like reading log files and debugging some code at the same time) is a pain
  3. No way to save and retrieve work.

Enter tmux. Tmux is a terminal multiplexer. What’s that? In simple words, it lets you have multiple tabs open on a server. You can move around tabs, long-running processes don’t get killed, and what’s more, you can pick up just where you left off last time.

Tmux is very memory efficient and will keep running as long as the server doesn’t reboot. Tmux is also useful for pair programming, debugging something with your buddy, etc. Multiple people can join the same tmux session and see what’s happening on the server.

I am very fond of tmux and use it on all the servers that I manage. In fact, it’s the first thing I configure on any server.

tmux session

Tmux is very useful when you have to manage remote servers

Tmux is also very configurable, so you can change it to work the way you want. If you spend a lot of time managing servers over SSH, I’d recommend spending the time to learn tmux well. You can thank me later! 🙂

Here’s my tmux.conf: https://github.com/ckailash/dotfiles/blob/master/.tmux.conf

It comes with these shortcuts:

Create new window: alt + n
Move between windows: shift + left arrow (to move left) or shift + right arrow (to move right)
Kill a window: ctrl + d
Rename a window: alt + r
Move a window: shift + up arrow (to move right) or shift + down arrow (to move left)
Detach from tmux: ctrl + b (press and let go) then d

I also add some aliases to my .bashrc so tmux is easier to deal with:
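Something along these lines – the exact aliases are a sketch matching the shortcuts listed below:

```sh
alias tm='tmux new-session'      # open a new tmux session
alias tl='tmux list-sessions'    # list all tmux sessions
alias ta='tmux attach -t'        # join an existing session: ta <name>
```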

Once you’ve added the above to your .bashrc file and sourced it, tmux becomes much quicker to drive:

Opening a new tmux session: tm
Listing all the tmux sessions: tl
Joining an existing tmux session: ta

Hope this helps!


14 questions you should ask when outsourcing a tech project

Finding trustworthy, hard-working, quality offshore development teams is a tough job. Hiring the wrong people can be disastrous for your business and a waste of your time and money. When building a technology product, it might be tempting to go with an offshore team because they’re cheap, but it also comes with a lot of headaches:

Outsourcing headaches:

  1. Communication problems
  2. Timezone issues
  3. Making them understand the business and making sure the right product is being built
  4. The people writing the code are (most of the time) doing it for the money, so they don’t care much about the product.

Don’t you feel like pulling your hair out when outsourcing projects?

This is what it feels like when you outsource a project, but with the questions detailed below, you can get your sanity back and make an informed decision

With that said, I’ve found that with the right kind of background research and vetting, you can find talented and motivated teams. Before you hand over any money, make sure to evaluate them like you would evaluate any local hire.

Here are some questions I’ve used in the past to unearth quality teams:

Questions to ask your remote development team before hiring them

  1. Can you show us code samples from the programmers who will be working on the project?
  2. How many people will be on the team for my project?
  3. Will they be shuffled around projects or assigned permanently?
  4. How many projects have the programmers who will be on my team completed?
  5. How long have they been employed?
  6. What’s your attrition rate? (for companies > 200 people)
  7. What’s the experience of the project manager? Can I speak to them before starting off and get an intro and spend some time with them? (So we can get an idea of the communication skills)
  8. Do you use version control? Code reviews? (Actually, it’d be worth evaluating how many points they score on the 12-question Joel Test.)
  9. If you follow agile – what are the sprint deliverables for each week? When is the sprint review meeting? Is this communicated to us at the start so we can follow progress easily?
  10. Do you use a task management board to track progress? (Eg. Trello, Asana)
  11. How does your team communicate with clients? Skype? Slack?
  12. What’s your general development process? What do you do first?
    • When does database schema design happen?
    • Is design finished first and then the programmers start integrating, or does it happen in parallel? How do you make sure everything is responsive? Will you use Twitter Bootstrap or some such framework?
    • How do you handle database changes? Do you use migrations? (phinx or http://drarok.com/ladder/)
    • How do you deploy code usually? Do you pull from git? Do you use deployment tools like Ansible, Puppet?
    • Can you show examples of previous work you’ve done and is it possible to read the source code to check the quality? (Could be tricky as they may not have projects where they have exclusive rights to the source code, and may not be able to share it with you and the alternative will be to give them a small piece of work and evaluate as in point 14)
  13. How do your programmers deal with roadblocks? Are they talented enough to invest time on R&D and learn? If they lack that expertise, do you have dedicated R&D teams that do this?
  14. Are you ok with doing a small proof of concept work for a fee so we can evaluate your work? Can you try doing a proof of concept streaming with lighttpd, apache streaming modules as a way to show us you can handle this project? (We’ll pay you a flat fee of $200 for this)

Outsourcing can be a huge money saver if done right. Like any business relationship, you have to think of the other party and their motivations as well.

Ubuntu ssh-agent autostart on login

ssh-agent provides a way of securely storing private keys for SSH. This means all applications that rely on SSH underneath, like git or svn, can automatically use the keys when you want to interact with a private remote code repository.

Many a time I’ve been stumped to find ssh-agent not already running on my VPSes from DigitalOcean and Amazon when I log in via SSH.

I would manually start the ssh-agent with the below command
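The standard incantation:

```shell
eval "$(ssh-agent -s)"
```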

And then manually add all the keys to the ssh-agent by running
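Assuming the default key location:

```sh
ssh-add ~/.ssh/id_rsa
```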

We can verify whether the keys were added by typing
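which lists the fingerprints of the loaded keys:

```sh
ssh-add -l
```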

I had to do this every other day and wanted to automate it. In the process, I landed upon a piece of shell code that automates the whole thing.
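A sketch of that snippet for your ~/.bash_profile – the bare ssh-add assumes your keys live in ~/.ssh under the default names:

```sh
# Start ssh-agent if one isn't running yet, then load the default keys
if [ -z "$SSH_AUTH_SOCK" ]; then
    eval "$(ssh-agent -s)" > /dev/null
    ssh-add 2> /dev/null
fi
```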

Once this is added to your .bash_profile, every time you open a terminal, ssh-agent is started if it’s not running already and then all the key files are loaded into it.

For this to be applied to the currently running terminal, you need to run
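That is:

```sh
source ~/.bash_profile
```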

Hope this helps!

Converting a static website into a WordPress website

I was recently working on a project that required me to convert a static HTML website into a WordPress site. After finishing the layout and theme work, I was facing a pile of manual copy-paste labor. Determined not to waste my time, I looked around for solutions that crawled a static website and output HTML pages. I tried my hand at Python with BeautifulSoup, but it didn’t go anywhere meaningful.

After some fiddling around, I found a WordPress plugin called HTML Import 2 that lets you import content from static HTML files. All I had to do now was create those static HTML pages. As I searched, I found a beautiful solution that required no additional software on my Ubuntu machine:

wget. I was astonished at what this command could do, for I had always used it only to download a single link.

wget -r --level=0 --convert-links --page-requisites --no-parent <url> downloads all the pages on the site, recreates the directories for images, javascript and other assets as in the original site, and leaves you with a perfect copy of the site locally. Some extension renaming and twiddling around later, I had the converted WordPress site ready.